
Why science fatigue keeps us clinging to bad habits

Daniel du Plooy

This article is published in collaboration with The Conversation. Publication does not imply endorsement of views by the World Economic Forum.

The World Health Organization (WHO) threw the cat among the pigeons last week with a new report linking eating red and processed meat to cancer.

It didn’t claim our way of life is killing us, but it would seem this way from the reactions. Agriculture Minister Barnaby Joyce, for instance, said the WHO would have humans living in caves were we to follow all its recommendations.

This response is all too familiar and highlights the public’s fundamental misunderstanding of how science works. Two issues stand in the way of, and often override, sensible interpretations of research findings – science fatigue and confirmation bias.

Science fatigue

The media constantly bombards us with the latest research on a plethora of topics without much nuance on its quality or relevance.

Last year red wine was good, this year it’s bad. Last month lots of water was good, this month it’s bad. Today you need more protein, tomorrow you need more carbohydrates.

This apparent seesaw in health journalism creates science fatigue in the public mind. The underlying science for most of these reports is sound, but as a New England Journal of Medicine editorial suggests, the reporting is often irresponsible and out to click-bait an unsuspecting public:

A problem that is worsening in this era of the 24/7 news cycle is the frequent failure to put new developments into any kind of reasonable context for readers or viewers. In this environment, reporters become little more than headline readers or conduct interviews that amount to a “hit and run” version of journalism.

The constant hype leads to distrust and erodes the integrity of scientific research. How can science be trusted if it can’t make up its mind?

All too often the distinction between scientific opinion and fact is not clear. Effectively engaging the public in often specialised scientific findings is a work in progress and has been a challenge for the media, governments and science for some time.

A 2000 United Kingdom report into the country’s mad cow disease outbreak in the 1990s concluded that a government department had provided inappropriate technical advice about the link between contaminated beef and human health. It said the department’s communication had provoked an “irrational public scare”.

A barrage of similar instances has created a crying wolf scenario, particularly when journalists and public relations operators report certain studies as the final word. When the real wolf appears (like last week’s WHO meat evaluation) we brush it away as insignificant and continue our existing behaviours.

Confirmation bias

Recently a family friend pronounced that his grandmother smoked all her life and reached the ripe old age of 90, so he is not worried about his “moderate” smoking habit. His grandmother may have had the potential to reach 120 as a non-smoker, but numerous other variables could have influenced the final result for her.

All too often, we base important health decisions on personal anecdotal experience. The plural of anecdote is not data, yet we grasp at any straw that reinforces our own opinions so we can maintain our status quo. This is called confirmation bias.

In an extensive review of this phenomenon, American psychologist Raymond Nickerson contends it might in fact be the single most problematic aspect of human reasoning.

…once one has taken a position on an issue, one’s primary purpose becomes that of defending or justifying that position. This is to say that regardless of whether one’s treatment of evidence was evenhanded before the stand was taken, it can become highly biased afterward.

Numerous studies have explained confirmation bias as it applies to all kinds of fundamental situations. For instance, we tend to seek out sources of information likely to reinforce what we already believe in, and we interpret the evidence in ways that support what we already believe.

Even the pressure to publish can create a bias in scientists which influences the objectivity and integrity of research.

A review of publications and related biases by the British National Institute for Health Research found that studies with significant or favourable results were more likely to be published or cited than those with non-significant or unfavourable results.

When our meat eating – which is seen as such a fundamental part of our existence, our culture, our economy and maybe even our identity – is attacked, we resort to confirmation bias and often use personal anecdotes as a counterattack.

Certainly anecdotes in health care shouldn’t be ignored, but they need to be understood together with formal research evidence.

Scientists aren’t exempt

The American Dietetic Association holds the position that meat is not required for a healthy diet. Yet we have heard many experts say otherwise. In some cases, this could be because meat eating is part of the social fabric of our society, and scientists aren’t exempt from bias.

A recent study noted that when scientists were put in situations where they were expected to be an expert or see themselves as experts, they tended to over-estimate the accuracy of their own beliefs.

Even if these beliefs stem from knowledge of their field, the tendency to cling to prior opinions increases the likelihood of bias.

Thankfully, once we are able to overcome our fatigue and biases, and reasonably consider the latest evidence, we can steer ourselves in a direction where the risk of cancer is lower without any knee-jerk reactions.


Author: Daniel du Plooy is a PhD candidate at the Australian Research Centre in Sex, Health and Society (ARCSHS), La Trobe University.

Image: Participants rest before a scientific seminar at the European Organization for Nuclear Research (CERN) in Meyrin, near Geneva, July 2012. REUTERS.

 
