News chatbots that present multiple viewpoints tend to earn the trust of conspiracy believers

A recent study published in the journal Computers in Human Behavior suggests that automated news chatbots programmed to deliver balanced viewpoints can earn the trust of people with varying ideological backgrounds. The research provides evidence that individuals who hold strong conspiracy beliefs tend to respond well to these chatbots, viewing them as useful tools for reading diverse news. These findings point to new ways technology might help pierce information bubbles and reduce societal division by exposing people to multiple perspectives.

In recent years, generative artificial intelligence has transformed how people interact with information online. Generative artificial intelligence refers to computer systems that can process massive amounts of text and generate human-like responses. News chatbots rely on similar technology to act as automated conversational agents. These programs let users browse news topics and provide real-time text summaries of articles in a chat window.

The authors behind the new study wanted to see if these chatbots could help solve a growing problem in the modern media landscape. People often engage in selective exposure, meaning they only click on news that matches their existing beliefs. Over time, this habit creates an echo chamber, which tends to increase political and social polarization.

When people are only exposed to one side of a story, they often become defensive or dismissive of alternative viewpoints. The scientists wanted to know if a neutral, automated chatbot could encourage people to step outside their comfort zones. They suspected that people might view a machine as more objective than a human journalist.

“People who believe in conspiracy theories tend to distrust mainstream media, seeing it as biased or agenda-driven,” said study author Shreya Dubey (@sdubey03), a postdoctoral researcher in the Amsterdam School of Communication Research at the University of Amsterdam.

“We wanted to test whether a chatbot, which may be perceived as more neutral than a traditional news outlet, might be better received by this group. We designed a chatbot that presented both mainstream and alternative news articles, then looked at whether conspiracy believers were more willing to trust and use it compared to people who don’t hold such beliefs.”

Specifically, the scientists developed a custom chatbot named Infobot. This program was designed to present users with eight different news headlines about climate change.

Four of the headlines represented mainstream scientific perspectives supporting climate action. The other four headlines represented alternative viewpoints, including arguments against climate action and narratives that framed climate change as a hoax. Users could scroll through the headlines and click on any article to read a brief summary generated by the chatbot.

After a participant read a summary, that article disappeared from the list, prompting them to choose another. The software tracked which articles users selected and how much time they spent reading them. In the first study, the scientists recruited a sample of 177 adult residents of the United States.
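The paper does not describe Infobot's implementation, but the interaction it reports (eight headlines, click to read, article disappears, reading time logged) can be sketched in a few lines. Everything below, including the headline texts and function names, is a hypothetical illustration, not the authors' code.

```python
import time

# Hypothetical stand-ins for the study's eight climate-change headlines:
# four mainstream, four alternative. The real texts are not published.
MAINSTREAM = [
    "Scientists urge rapid emissions cuts",
    "Renewables now cheaper than coal",
    "Ocean warming breaks heat records",
    "Report: human influence on climate is unequivocal",
]
ALTERNATIVE = [
    "Why climate models keep getting it wrong",
    "The economic case against climate action",
    "Is recent warming just natural variation?",
    "Climate change: a manufactured crisis?",
]

def run_session(pick, read, min_reads=4):
    """Simulate one Infobot session.

    `pick` chooses a headline from those still available (standing in
    for a user's click); `read` stands in for the user reading the
    summary in the chat window. Returns a log of choices and the time
    spent on each summary, mirroring what the study's software tracked.
    """
    remaining = MAINSTREAM + ALTERNATIVE
    log = []
    while remaining and len(log) < min_reads:
        headline = pick(remaining)            # user clicks a headline
        start = time.monotonic()
        read(headline)                        # user reads the summary
        elapsed = time.monotonic() - start    # time spent reading
        log.append({
            "headline": headline,
            "mainstream": headline in MAINSTREAM,
            "seconds": elapsed,
        })
        remaining.remove(headline)            # read articles disappear
    return log
```

A log like this is enough to reproduce the study's key behavioral measures: the mainstream-to-alternative ratio of selections and the reading time per summary for each group.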

They split these participants into two groups based on their responses to a questionnaire about general conspiracy theories. The final sample included 93 individuals with low generic conspiracy beliefs and 84 individuals with high generic conspiracy beliefs. Participants were instructed to interact with Infobot and read at least four article summaries.

Afterward, they answered survey questions rating the chatbot on its ease of use, perceived usefulness, and potential risks. They also rated their overall trust in the program, their general attitude toward it, and their intention to use such a tool in the future. The data showed that participants who found the chatbot useful and trustworthy tended to have a positive attitude toward it.

This positive attitude directly predicted their intention to use news chatbots again. Unexpectedly, the scientists found that people with high generic conspiracy beliefs trusted the chatbot more than those with low conspiracy beliefs. The high-belief group also reported a more positive attitude and a greater intention to use the program in the future.

Both groups read a similar number of mainstream and alternative articles. However, the tracking data revealed that individuals with higher conspiracy beliefs spent significantly less time reading the mainstream summaries compared to the alternative ones. The researchers noticed a potential flaw in their first study.

They had grouped people based on their belief in general conspiracies, rather than their specific beliefs about climate change. In fact, the two groups did not significantly differ in their actual belief in human-caused global warming. To fix this, the scientists conducted a second study.

For the second study, the researchers recruited 58 participants. This time, they specifically screened for beliefs about climate change. The sample included 35 individuals with low climate change conspiracy beliefs and 23 individuals with high climate change conspiracy beliefs.

The procedure was nearly identical to the first experiment. However, participants had to enter a special code from the chatbot to prove they had paid attention to the summaries. The second study replicated the findings of the first.

Once again, trust and perceived usefulness predicted a positive attitude toward the chatbot. Participants with high climate change conspiracy beliefs trusted the chatbot more and showed a greater intention to use it than those with low conspiracy beliefs. The scientists noted that both groups generally responded positively to the program, but the high-belief group was consistently more enthusiastic.

The researchers suspect this happens because individuals with strong conspiracy beliefs often feel that mainstream media is biased against them. Because the chatbot presented their alternative views on equal footing with mainstream science, they likely viewed the machine as a fair and unbiased source of information.

“Most of us, regardless of our beliefs, tend to think we’ve formed our opinions objectively and from good information,” Dubey told PsyPost. “Our findings suggest that a chatbot presenting multiple perspectives feels refreshingly balanced to people across the board, including those who distrust mainstream media.”

“But this raises an uncomfortable question: is balance always desirable? Climate change is not genuinely contested among scientists, yet our chatbot presented mainstream and alternative views side by side. While this approach made the tool widely accepted, it also risks creating a false equivalence: giving fringe or misleading viewpoints the same weight as scientific consensus. The very feature that made our chatbot appealing could, if applied carelessly, end up legitimising misinformation.”

“So the real takeaway is a tension worth sitting with: tools that feel balanced and neutral may be our best shot at reaching people across ideological divides, but ‘balance’ on issues like climate change is not a neutral act in itself,” Dubey said.

While the findings offer hope for reducing polarization, the researchers noted several limitations. First, the studies only compared people at the extreme ends of the conspiracy belief spectrum. Individuals with moderate beliefs were excluded from the main analysis, which means the results might not represent the entire population. Second, the participants only interacted with the chatbot one time in a controlled survey environment.

It is unclear if their positive attitudes would persist after repeated use over weeks or months. It also remains to be seen if people would voluntarily choose to use a balanced news chatbot in the real world when they have access to highly personalized social media feeds.

Future research should investigate exactly which features of the chatbot make it appealing to different groups. Scientists could also explore whether giving users some control over the ratio of mainstream to alternative news might increase their willingness to engage with opposing viewpoints.

The study, “Investigating perceived trust and utility of balanced news chatbots among individuals with varying conspiracy beliefs,” was authored by Shreya Dubey, Paul E. Ketelaar, Tilman Dingler, Hannah K. Peetz, and Hein T. van Schie.
