Emerging Technologies

Confirmation bias explains why the pundits aren’t always right


Ray Nickerson looks at 'confirmation bias'. Image: REUTERS/Chris Helgren

Ray Nickerson
Research Professor of Psychology, Tufts University

As post-mortems of the 2016 presidential election began to roll in, fingers started pointing at what psychologists call confirmation bias as one reason many of the polls and pundits were wrong in their predictions of which candidate would end up victorious.

Confirmation bias is usually described as a tendency to notice or search out information that confirms what one already believes, or would like to believe, and to avoid or discount information that’s contrary to one’s beliefs or preferences. It could help explain why many election-watchers got it wrong: in the runup to the election, they saw only what they expected, or wanted, to see.

Psychologists put considerable effort into discovering how and why people sometimes reason in less than totally rational ways. The confirmation bias is one of the better-known of the biases that have been identified and studied over the past few decades. A large body of psychological literature reports how confirmation bias works and how widespread it is.

The role of motivation

Confirmation bias can appear in many forms, but for present purposes, we may divide them into two major types. One is the tendency, when trying to determine whether to believe something is true or false, to look for evidence that it is true while failing to look for evidence that it is false.

Imagine four cards on a table, each one showing either a letter or a number on its visible side. Let’s say the cards show A, B, 1 and 2. Suppose you are asked to indicate which card or cards you would have to turn over in order to determine whether the following statement is true or false: If a card has A on its visible side, it has 1 on its other side. The correct answer is the card showing A and the one showing 2. But when people are given this task, a large majority choose to turn either the card showing A alone or both the card showing A and the one showing 1. Relatively few see the card showing 2 as relevant, even though finding A on its other side would prove the statement false. One possible explanation for people’s poor performance on this task is that they look for evidence that the statement is true and fail to look for evidence that it is false.
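The card task above is known in the psychological literature as the Wason selection task, and its logic can be checked mechanically. The short Python sketch below (my own illustration, not from the article) encodes which cards could possibly falsify the rule, and therefore must be turned over:

```python
# Wason selection task: cards show A, B, 1, 2.
# Rule under test: "If a card has A on its visible side, it has 1 on its other side."
# A card must be turned over only if its hidden side could falsify the rule.

def must_turn(visible_face):
    """Return True if the card with this visible face must be checked."""
    if visible_face == "A":
        return True   # its hidden side might not be 1 -> would falsify the rule
    if visible_face == "2":
        return True   # its hidden side might be A -> would falsify the rule
    # "B" is irrelevant: the rule says nothing about cards showing B.
    # "1" cannot falsify: the rule does not require 1's reverse to be A.
    return False

cards = ["A", "B", "1", "2"]
print([c for c in cards if must_turn(c)])  # ['A', '2']
```

The common mistake of turning the card showing 1 corresponds to seeking confirming instances (an A behind the 1 would "fit" the rule but proves nothing), while neglecting the card showing 2, the only other card that could disconfirm it.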

Another type of confirmation bias is the tendency to seek information that supports one’s existing beliefs or preferences or to interpret data so as to support them, while ignoring or discounting data that argue against them. It may involve what is best described as case building, in which one collects data to lend as much credence as possible to a conclusion one wishes to confirm.

At the risk of oversimplifying, we might call the first type of bias unmotivated, inasmuch as it doesn’t involve the assumption that people are driven to preserve or defend their existing beliefs. The second type of confirmation bias may be described as motivated, because it does involve that assumption. It may go a step further than just focusing on details that support one’s existing beliefs; it may involve intentionally compiling evidence to confirm some claim.
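One way to see how discounting disconfirming evidence distorts a conclusion is with a toy numerical model. The sketch below is my own illustration, not the author's: it assumes an observer who counts confirming observations at full weight but disconfirming ones at only a fraction of their weight, and shows that perfectly balanced evidence then looks like strong support.

```python
# Toy model of motivated confirmation bias (illustrative assumption, not from
# the article): confirming evidence counts fully, disconfirming evidence is
# discounted by a factor between 0 and 1.

def perceived_support(evidence, discount=0.3):
    """Fraction of support the observer perceives.

    `evidence` is a list of booleans (True = confirms the belief).
    With discount=1.0 the observer is unbiased; smaller values mean
    disconfirming observations are increasingly ignored.
    """
    confirm = sum(1 for e in evidence if e)
    disconfirm = len(evidence) - confirm
    return confirm / (confirm + discount * disconfirm)

# Perfectly balanced evidence: 500 observations for, 500 against.
neutral = [True] * 500 + [False] * 500
print(f"{perceived_support(neutral):.2f}")  # 0.77: mixed data feels like confirmation
```

With the discount set to 1.0 the same data yield 0.50, the unbiased reading; at 0.3 the observer walks away convinced the evidence runs three to one in their favor.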

It seems likely that both types played a role in shaping people’s election expectations.

Case building versus unbiased analysis

An example of case building and the motivated type of confirmation bias is clearly seen in the behavior of attorneys arguing a case in court. They present only evidence that they hope will increase the probability of a desired outcome. Unless obligated by law to do so, they don’t volunteer evidence that’s likely to harm their client’s chances of a favorable verdict.

Another example is a formal debate. One debater attempts to convince an audience that a proposition should be accepted, while another attempts to show that it should be rejected. Neither wittingly introduces evidence or ideas that will bolster the adversary’s position.

In these contexts, it is proper for protagonists to behave in this fashion. We generally understand the rules of engagement. Lawyers and debaters are in the business of case building. No one should be surprised if they omit information likely to weaken their own argument. But case building occurs in contexts other than courtrooms and debating halls. And often it masquerades as unbiased data collection and analysis.

Where confirmation bias becomes problematic

One sees the motivated confirmation bias in stark relief in commentary by partisans on controversial events or issues. Television and other media remind us daily that events evoke different responses from commentators depending on the positions they’ve taken on politically or socially significant issues. Politically liberal and conservative commentators often interpret the same event and its implications in diametrically opposite ways.

Anyone who followed the daily news reports and commentaries regarding the election should be keenly aware of this fact and of the importance of political orientation as a determinant of one’s interpretation of events. In this context, the operation of the motivated confirmation bias makes it easy to predict how different commentators will spin the news. It’s often possible to anticipate, before a word is spoken, what specific commentators will have to say regarding particular events.

Here the situation differs from that of the courtroom or the debating hall in one very important way: Partisan commentators attempt to convince their audience that they’re presenting a balanced factual – unbiased – view. Presumably, most commentators truly believe they are unbiased and responding to events as any reasonable person would. But the fact that different commentators present such disparate views of the same reality makes it clear that they cannot all be correct.

Selective attention

Motivated confirmation bias expresses itself in selectivity: selectivity in the data one pays attention to and selectivity with respect to how one processes those data.

When one selectively listens only to radio stations, or watches only TV channels, that express opinions consistent with one’s own, one is demonstrating the motivated confirmation bias. When one interacts only with people of like mind, one is exercising the motivated confirmation bias. When one asks for critiques of one’s opinion on some issue of interest, but is careful to ask only people who are likely to give a positive assessment, one is doing so as well.

This presidential election was undoubtedly the most contentious of any in the memory of most voters, including most pollsters and pundits. Extravagant claims and counterclaims were made. Hurtful things were said. Emotions were much in evidence. Civility was hard to find. Sadly, “fallings out” within families and among friends have been reported.

The atmosphere was one in which the motivated confirmation bias would find fertile soil. There is little doubt that it did just that and little evidence that arguments among partisans changed many minds. That most pollsters and pundits predicted that Clinton would win the election suggests that they were seeing in the data what they had come to expect to see – a Clinton win.

None of this is to suggest that the confirmation bias is unique to people of a particular partisan orientation. It is pervasive. I believe it to be active independently of one’s age, gender, ethnicity, level of intelligence, education, political persuasion or general outlook on life. If you think you’re immune to it, it is very likely that you’ve neglected to consider the evidence that you’re not.


License and Republishing

World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.
