Could AI Help Us Be More Thoughtful Voters?

Democracy depends on an informed electorate. But diving into the context swirling around ballot measures, where some of the country's most consequential policy questions are now decided, is no easy task. The last decade has seen social media inflame passions and amplify misinformation. Can newer forms of technology nudge us to reason more carefully?
Chenhao Tan (Faculty Co-Director of Novel Intelligence at the DSI and Associate Professor of Computer Science and Data Science) and his team hope so. They’ve developed CivicChats, an AI platform designed to help voters engage more critically and thoughtfully with the issues shaping their communities.
CivicChats grows out of Tan's research on the relationship between language, technology, and political discourse. His lab's prior work has used computational tools to study how political speech divides and persuades, including research that developed novel metrics for measuring the divisiveness and uniqueness of presidential rhetoric. That body of work diagnosed how existing technology like social media tends to reinforce division rather than support deliberation. CivicChats is, in many ways, a response to that diagnosis: a tool built around the question of what AI should do to support democratic participation, not just what it can do.
The team, which includes collaborators at the Australian National University, began with the premise that good civic reasoning involves understanding what’s at stake, grappling with competing considerations, and examining the values driving your own position. Many popular large language models fall short of this standard, tending toward sycophancy, simply agreeing with users rather than challenging or clarifying their thinking. But without meaningful pushback, these tools do little to encourage balanced reasoning.
“Endorsements replace deliberation rather than facilitate it,” said Mourad Heddaya, a PhD candidate in Computer Science and a member of the CivicChats team. “Campaign messaging is designed to persuade, not clarify. And AI assistants tend to accept your frame and move too quickly toward a tidy answer. We wanted to build something that actually sits with the tensions in a political question.”
Building a chatbot that genuinely supports civic reasoning meant paying careful attention to what makes a productive political conversation. The team designed CivicChats to push back on user positions and probe reasoning rather than simply agreeing. To systematically assess how well it does this, the team also built CivicEval, an evaluation framework that reviews conversations against structured rubrics, assessing whether the chatbot is evenhanded, appropriately challenging, and free of sycophantic tendencies.
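The article does not describe CivicEval's internals, but the general shape of a rubric-based conversation review can be sketched as follows. The rubric dimension names, the 0–2 scoring scale, and the per-turn averaging are all illustrative assumptions, not the actual CivicEval design:

```python
from dataclasses import dataclass

# Hypothetical rubric dimensions, loosely named after the qualities the
# article mentions (evenhandedness, appropriate challenge, non-sycophancy).
RUBRIC = ("evenhandedness", "challenge", "non_sycophancy")


@dataclass
class TurnReview:
    """Scores for one chatbot turn, each on an assumed 0-2 scale."""
    evenhandedness: int
    challenge: int
    non_sycophancy: int


def review_conversation(turn_reviews):
    """Average each rubric dimension across all reviewed chatbot turns."""
    n = len(turn_reviews)
    return {
        dim: sum(getattr(t, dim) for t in turn_reviews) / n
        for dim in RUBRIC
    }


# Example: two reviewed turns from one conversation.
reviews = [
    TurnReview(evenhandedness=2, challenge=1, non_sycophancy=2),
    TurnReview(evenhandedness=1, challenge=2, non_sycophancy=2),
]
scores = review_conversation(reviews)  # e.g. {"evenhandedness": 1.5, ...}
```

In practice the per-turn scores would come from human annotators or an LLM judge applying the rubric; the sketch only shows how per-turn judgments might aggregate into conversation-level scores.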
The CivicChats platform offers three conversation modes depending on what users are looking for:
- Q&A mode helps users understand what a measure does and what its main considerations are, presenting relevant information evenhandedly without favoring one side.
- Argumentative mode presents strong arguments opposing the user’s position, helping them consider alternative perspectives and stress-test their views.
- Reflective mode asks questions about what values are driving a user’s reaction, what their position depends on, and what it would take to change their view.
Users can browse measures on their state's ballot or search nationally by topic or status to learn more about an issue and the policy options under consideration. As they discuss a measure, they can record their position (yes, no, or undecided) and update it as their thinking develops.
Whether good chatbot behavior translates into better outcomes for voters is a question the team plans to investigate next. A preregistered user study is underway to compare CivicChats’ three modes against non-chatbot baselines across several measures of civic reasoning: voter understanding of ballot measures and their tradeoffs, decision confidence, and the quality of participants’ justifications for their positions.
The team is actively seeking partners and participants for the study. If you’re interested, you can try CivicChats at civicchats.org or reach out to Mourad Heddaya (mourad@uchicago.edu) to learn more.