Chatbots are not impartial - they tell us what we want to hear
05-17-2024

New research led by Johns Hopkins University suggests that chatbots, often perceived as neutral, may actually reinforce existing ideologies and contribute to increased polarization on contentious issues. 

This study challenges the common belief that chatbots provide impartial information and reveals how conversational search systems might deepen societal divisions and make individuals more susceptible to manipulation.

Biased responses from chatbots 

“Because people are reading a summary paragraph generated by AI, they think they’re getting unbiased, fact-based answers,” said lead author Ziang Xiao, an assistant professor of computer science at Johns Hopkins who specializes in human-AI interactions. 

“Even if a chatbot isn’t designed to be biased, its answers reflect the biases or leanings of the person asking the questions. So really, people are getting the answers they want to hear.”

How chatbots affect online search behaviors 

The scientists explored how chatbots affect online search behaviors by comparing interactions with different search systems. 

The 272 participants were first asked to write down their initial thoughts on controversial topics such as health care, student loans, or sanctuary cities. They were then directed to gather more information using either a chatbot or a traditional search engine developed for the study.

After reviewing the search results, participants wrote a second essay and responded to questions about the topic. They were also exposed to two articles with opposing viewpoints and asked to evaluate the trustworthiness of the information and the extremity of the views.

The study revealed that because chatbots provide a narrower scope of information and echo users’ pre-existing beliefs, participants who used them were more entrenched in their original positions and reacted more strongly against opposing viewpoints.

Echo chamber effect

“People tend to seek information that aligns with their viewpoints, a behavior that often traps them in an echo chamber of like-minded opinions,” explained Xiao. “We found that this echo chamber effect is stronger with the chatbots than traditional web searches.”

This effect arises partly from how users interact with chatbots: rather than typing keywords, they pose full questions, and the phrasing of those questions steers the system toward answers focused narrowly on either the benefits or the drawbacks of an issue.

“With chatbots, people tend to be more expressive and formulate questions in a more conversational way. It’s a function of how we speak,” Xiao said. “But our language can be used against us.”  

Potential for a more polarized society 

AI developers can design chatbots to detect clues in questions that indicate a user’s preferences and biases, allowing the AI to tailor its responses accordingly. For example, the researchers tested a chatbot programmed with a hidden agenda to agree with users, which intensified the echo chamber effect.
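To make the idea concrete, here is a minimal, hypothetical sketch of how a conversational search system could mirror the leaning implied by a question's phrasing. The stance cues, canned summaries, and function names are illustrative assumptions only; this is not the system the researchers actually built or tested.

```python
# Hypothetical sketch: infer a user's leaning from phrasing cues and return a
# one-sided answer that agrees with it. All cues and summaries are invented
# for illustration, not taken from the study.

PRO_CUES = ("benefits of", "why is", "good for", "should we expand")
CON_CUES = ("problems with", "drawbacks of", "bad for", "why ban")

SUMMARIES = {
    "pro": "Here is evidence supporting the position your question implies...",
    "con": "Here is evidence against the policy your question challenges...",
    "neutral": "Here is an overview of arguments on both sides...",
}

def detect_leaning(question: str) -> str:
    """Guess the user's leaning from cues in a full-sentence question."""
    q = question.lower()
    if any(cue in q for cue in PRO_CUES):
        return "pro"
    if any(cue in q for cue in CON_CUES):
        return "con"
    return "neutral"

def answer(question: str) -> str:
    # An "agreeable" chatbot returns only the summary that matches the
    # detected leaning, reinforcing the echo chamber the study describes.
    return SUMMARIES[detect_leaning(question)]

print(answer("What are the benefits of sanctuary cities?"))  # one-sided "pro" answer
```

In this toy version the bias comes from a simple keyword check; in a real large-language-model chatbot the same tailoring can emerge implicitly from the conversational wording of the prompt, which is why the effect is hard for users to notice.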

Attempts to mitigate this effect by training a chatbot to offer contrary opinions proved ineffective, as did efforts to encourage fact-checking by linking to source information – few participants followed these links.

“Given AI-based systems are becoming easier to build, there are going to be opportunities for malicious actors to leverage AIs to make a more polarized society. Creating agents that always present opinions from the other side is the most obvious intervention, but we found they don’t work,” said Xiao.

More about chatbots 

Chatbots are computer programs designed to simulate conversation with human users, primarily through the internet. They are widely used in customer service to provide users with instant responses to inquiries and to handle simple tasks, freeing up human agents for more complex issues. 

Chatbots are built using various technologies, including predefined scripts and artificial intelligence. AI chatbots, in particular, can learn from interactions and improve over time, becoming more efficient in understanding and responding to user requests. 
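As a quick illustration of the "predefined script" approach mentioned above, the sketch below matches incoming messages against fixed patterns and returns canned replies. The intents and responses are illustrative assumptions for a generic customer-service bot, not any particular product.

```python
# Minimal rule-based (scripted) chatbot: fixed patterns mapped to canned replies.
import re

SCRIPT = [
    (re.compile(r"\b(hours|open|closing)\b", re.I),
     "We are open 9am to 5pm, Monday to Friday."),
    (re.compile(r"\b(refund|return)\b", re.I),
     "You can request a refund within 30 days of purchase."),
    (re.compile(r"\b(human|agent|person)\b", re.I),
     "Connecting you to a human agent now."),
]

def reply(message: str) -> str:
    """Return the first scripted response whose pattern matches the message."""
    for pattern, response in SCRIPT:
        if pattern.search(message):
            return response
    # Unlike an AI chatbot, a scripted bot cannot generalize beyond its rules.
    return "Sorry, I didn't understand that. Could you rephrase?"

print(reply("What are your opening hours?"))
```

AI-based chatbots replace the fixed rule table with a learned model, which is what lets them handle open-ended questions but also what allows the biases discussed above to creep in.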

They are implemented in various platforms, such as websites, messaging apps, and virtual assistants, making them accessible and convenient for a broad range of uses. 

From booking appointments to providing weather updates and supporting e-commerce transactions, chatbots have become an integral part of digital interaction.

The findings of the study were presented at the Association for Computing Machinery's CHI Conference on Human Factors in Computing Systems.

