Why People Think They’re Always Right

A new study identifies an intriguing human tendency, the "illusion of information adequacy": the sense of having all the facts needed to make a sound decision even when crucial information is missing.

If you find yourself confidently asserting your opinion in a debate, a new study reveals you might be more mistaken than you think. Researchers from The Ohio State University, Johns Hopkins University and Stanford University have identified an intriguing human tendency they call the “illusion of information adequacy.” This phenomenon leads people to believe they have all the facts needed to make a sound decision, even when crucial information is missing.

“We found that, in general, people don’t stop to think whether there might be more information that would help them make a more informed decision,” co-author Angus Fletcher, a professor of English at The Ohio State University and member of the university’s Project Narrative, said in a news release.

Published in PLOS ONE, the research involved an experiment with 1,261 American participants.

They were divided into three groups and provided with an article about a fictional school facing a water shortage. The first group read only arguments for merging the struggling school with another that had adequate resources. The second group read only arguments for keeping the schools separate and seeking alternative solutions. The third group, serving as a control, received both sets of arguments.

The results showed that participants who read only one side of the argument were more confident in their decisions than those who read both perspectives.

“Those with only half the information were actually more confident in their decision to merge or remain separate than those who had the complete story,” Fletcher added. “They were quite sure that their decision was the right one, even though they didn’t have all the information.”

The study also found that participants with partial information assumed most others would agree with their decision, which further reinforced their confidence in their choice.

However, there is a silver lining. The study found that once participants who initially read only one side of the argument were exposed to the opposing view, many reconsidered their stance.

This suggests that while people may initially operate under the illusion of information adequacy, they can still change their minds when presented with a fuller picture.

Fletcher pointed out that this kind of cognitive bias is different from “naïve realism,” which deals with the belief that one’s perception of a situation is the absolute truth. Naïve realism usually involves more ideological issues, where people may resist new information that clashes with their established views.

“But most interpersonal conflicts aren’t about ideology. They are just misunderstandings in the course of daily life,” Fletcher added.