Why people often believe they are right, even when they're wrong
10-14-2024

Information shapes our perceptions and decisions, but what happens when we don’t have the whole story?

Most of us have experienced that moment in a debate when we are utterly convinced we’re right, only to later discover that we were missing key details.

This phenomenon, termed the “illusion of information adequacy,” reveals our tendency to assume we have enough data to make informed decisions, even when critical details are missing.

Recent research explores this intriguing concept, uncovering how we often believe our judgments are well-founded despite incomplete information.

The illusion of information adequacy

The illusion of information adequacy refers to the natural assumption that we have all the necessary data to make informed decisions or to defend our stances.

“We found that, in general, people don’t stop to think whether there might be more information that would help them make a more informed decision,” noted study co-author Angus Fletcher, a professor at Ohio State University.

According to Professor Fletcher, when individuals encounter a few aligned pieces of information, they’re quick to conclude, “that sounds about right,” and proceed accordingly.

Focus of the research

To better understand this tendency, the team analyzed the behavior of 1,261 Americans. Participants were divided into three groups, each of which read an article about a fictional school that lacked adequate water.

The first group read an article supporting the idea of merging with another school, the second group’s article advocated for staying separate, and the third control group read both arguments.

Confident decisions without the facts

The results showed that the groups given one-sided information still felt confident in their decision-making and were inclined to follow the recommendations of the article they had read.

“Those with only half the information were actually more confident in their decision to merge or remain separate than those who had the complete story,” noted Professor Fletcher.

Interestingly, those who lacked complete information even assumed that most people would make the same decision they did.

Different behavioral patterns revealed

Despite the concerning findings, Professor Fletcher shared an optimistic perspective. In the study, many of the participants who initially had one-sided information changed their minds upon encountering the other side’s argument.

However, this may not apply universally, especially on deeply entrenched ideological issues, where people may mistrust new information or reframe it to fit their pre-existing beliefs.

The study’s findings shed light on another interesting behavioral pattern – naive realism. This refers to the belief that one’s subjective perception of a situation is the complete objective truth.

The illusion of information adequacy, however, suggests that people who disagree could arrive at a shared understanding, provided both sides are adequately informed.

Fighting the illusion of information adequacy

Fletcher, who also investigates the impact of narratives on people, advises everyone to ensure they have the full story before assuming a stance or making a decision.

“Your first move when you disagree with someone should be to think, ‘Is there something that I’m missing that would help me see their perspective and understand their position better?’ That’s the way to fight this illusion of information adequacy,” said Fletcher.

According to the researchers, the results broadly support the theory that individuals maintain an illusion of having adequate information, and that this illusion shapes several downstream outcomes related to decision-making.

Information gaps and decision-making pitfalls

In decision-making, the illusion of having enough information can lead us into traps. The study demonstrates that when people receive incomplete or one-sided information, they not only feel confident but are often more convinced of their stance than they would be with access to the full picture.

This overconfidence stems from our tendency to fill in missing information with assumptions that reinforce our beliefs.

The research suggests that information gaps can contribute to a rigid mindset, where new insights are either dismissed or reshaped to fit an existing narrative.

By being aware of these information gaps, we can better navigate complex situations, question our assumptions, and ultimately make decisions that are more nuanced, informed, and reflective of the broader context.

The study is published in the journal PLoS ONE.
