AI Tool Helps People with Opposing Views Find Common Ground in Public Policy Deliberations


Artificial intelligence is showing promise in helping people with opposing views find common ground. A new tool developed by Google DeepMind uses AI to mediate discussions and synthesise differing viewpoints into balanced summaries. In recent trials, participants favoured the AI-generated summaries over those produced by human mediators, suggesting that AI could serve as an effective mediator for complex discussions. The findings were published in Science on October 17, 2024.

The system, named the Habermas Machine after philosopher Jürgen Habermas, was created by a team led by Christopher Summerfield, Research Director at the UK AI Safety Institute. The tool was tested with groups of participants who discussed topics in UK public policy, and it was tasked with summarising their differing views into a single coherent, representative statement. When participants rated these summaries, 56 percent preferred the AI-generated versions, while the remainder favoured the human mediator's. Independent reviewers also gave the AI-generated summaries higher scores for fairness and clarity.

AI in Citizen Deliberations

Summerfield explained that the AI model learns from group members' preferences in order to produce the summaries that receive the most endorsement from them. While this is an encouraging step towards improving deliberative processes, it also highlights the challenges of scaling such democratic initiatives.
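The article does not spell out the selection mechanics, but the description above suggests a candidate-and-score loop: a generative model proposes several draft statements, a preference model predicts how strongly each member would endorse each draft, and the draft with the highest aggregate predicted endorsement is chosen. The sketch below illustrates that idea in Python; the function names, the stand-in scoring function, and the simple sum used to aggregate endorsements are illustrative assumptions, not details taken from the study.

```python
# Minimal sketch of preference-guided summary selection, under the assumptions above.
# Names and the aggregation rule are illustrative, not the published implementation.
from typing import Callable, List


def select_group_statement(
    candidates: List[str],
    member_opinions: List[str],
    predict_endorsement: Callable[[str, str], float],
) -> str:
    """Return the candidate statement with the highest total predicted endorsement.

    predict_endorsement(opinion, candidate) is assumed to score how strongly the
    member holding `opinion` would endorse `candidate` (e.g. the output of a
    preference model learned from participants' feedback, as described above).
    """
    def total_endorsement(candidate: str) -> float:
        return sum(predict_endorsement(op, candidate) for op in member_opinions)

    return max(candidates, key=total_endorsement)


# Toy usage with a stand-in scorer (word overlap), purely for illustration.
if __name__ == "__main__":
    opinions = [
        "The voting age should be lowered to 16 to engage young people.",
        "The voting age should stay at 18 until civic education improves.",
    ]
    candidates = [
        "The group is divided on lowering the voting age to 16, but agrees that "
        "stronger civic education should accompany any change.",
        "Everyone agrees the voting age must be lowered immediately.",
    ]

    def overlap_score(opinion: str, candidate: str) -> float:
        a, b = set(opinion.lower().split()), set(candidate.lower().split())
        return len(a & b) / len(a | b)

    print(select_group_statement(candidates, opinions, overlap_score))
```

In the real system, the scoring step would be a learned model of each participant's preferences rather than the toy overlap measure used here.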

The research involved an experiment with 439 UK residents, in which participants shared their opinions on public policy questions. Each group's responses were processed by the AI, which generated a combined summary. Notably, the study found that the tool increased agreement among group members, pointing to potential applications in real-world policy-making.

Ethan Busby, an AI researcher at Brigham Young University, sees this as a promising direction, while others, such as Sammy McKinney of the University of Cambridge, urge caution about the reduced human connection in AI-mediated debates.
