People's impulses to share news on social media fuel misinformation, says MIT study
Peter Dizikes
Writer, MIT News Office

- People’s desire to share news content on social media can hinder their ability to judge it accurately, according to an MIT study.
- The researchers showed people a series of true and false headlines and asked them to consider either only whether they wanted to share each story or only whether it was accurate.
- They also ran tests where they asked people to consider accuracy and whether they wanted to share content at the same time.
- It appears possible that thinking about sharing headlines creates a distracted condition that detracts from people’s ability to tell truth from falsity, the researchers say.
As a social media user, you may be eager to share content. You may also try to judge whether it is true. But for many people it is difficult to do both of these things at once.
That’s the conclusion of a new experiment led by MIT scholars, which finds that even considering whether or not to share news items on social media reduces people’s ability to tell truths from falsehoods.
The study asked people to assess whether various news headlines were accurate. But if participants were first asked whether they would share that content, they were 35 percent worse at telling truths from falsehoods. Participants were also 18 percent less successful at discerning truth when asked about sharing immediately after rating a headline’s accuracy.
“Just asking people whether they want to share things makes them more likely to believe headlines they wouldn’t otherwise have believed, and less likely to believe headlines they would have believed,” says David Rand, a professor at the MIT Sloan School of Management and co-author of a new paper detailing the study’s results. “Thinking about sharing just mixes them up.”
The results suggest an essential tension between sharing and accuracy in the realm of social media. While people’s willingness to share news content and their ability to judge it accurately can both be bolstered separately, the study suggests the two things do not positively reinforce each other when considered at the same time.
“The second you ask people about accuracy, you’re prompting them, and the second you ask about sharing, you’re prompting them,” says Ziv Epstein, a PhD student in the Human Dynamics group at the MIT Media Lab and another of the paper’s co-authors. “If you ask about sharing and accuracy at the same time, it can undermine people’s capacity for truth discernment.”
The paper, “The social media context interferes with truth discernment,” is published today in Science Advances. The authors are Epstein; Nathaniel Sirlin, a research assistant at MIT Sloan; Antonio Arechar, a professor at the Center for Research and Teaching in Economics in Mexico; Gordon Pennycook, an associate professor at the University of Regina; and Rand, who is the Erwin H. Schell Professor, a professor of management science and of brain and cognitive sciences, and the director of MIT’s Applied Cooperation Team.
To carry out the study, the researchers conducted two waves of online surveys of 3,157 Americans whose demographic characteristics approximated U.S. averages for age, gender, ethnicity, and geographic distribution. All participants used either Twitter or Facebook. People were shown a series of true and false headlines about politics and the Covid-19 pandemic and were randomly assigned to conditions: some were asked only about accuracy or only about sharing content, while others were asked about both, in differing orders. From this survey design, the scholars could determine the effect that being asked about sharing content has on people’s news accuracy judgments.
In conducting the survey, the researchers were exploring two hypotheses about sharing and news judgments. One possibility is that being asked about sharing could make people more discerning about content, because they would not want to share misleading news items. The other is that asking people about sharing headlines feeds into the generally distracted state in which consumers view news on social media, and therefore detracts from their ability to tell truth from falsity.
“Our results are different from saying, ‘If I told you I was going to share it, then I say I believe it because I don’t want to look like I shared something I don’t believe,’” Rand says. “We have evidence that that’s not what is going on. Instead, it’s about more generalized distraction.”
The research also examined partisan leanings among participants and found that when it came to Covid-19 headlines, being prompted about sharing affected the judgment of Republicans more than Democrats, although there was not a parallel effect for political news headlines.
“We don’t really have an explanation for that partisan difference,” Rand says, calling the issue “an important direction for future research.”
As for the overall findings, Rand suggests that, as daunting as the results might sound, they also contain some silver linings. One conclusion of the study is that people’s belief in falsehoods may be more influenced by their patterns of online activity than by an active intent to deceive others.
“I think there’s in some sense a hopeful take on it, in that a lot of the message is that people aren’t immoral and purposely sharing bad things,” Rand says. “And people aren’t totally hopeless. But more it’s that the social media platforms have created an environment in which people are being distracted.”
Eventually, the researchers say, those social media platforms could be redesigned to create settings in which people are less likely to share misleading and inaccurate news content.
“There are ways of broadcasting posts that aren’t just focused on sharing,” Epstein says.
He adds: “There’s so much room to grow and develop and design these platforms that are consistent with our best theories about how we process information and can make good decisions and form good beliefs. I think this is an exciting opportunity for platform designers to rethink these things as we take a step forward.”
The project was funded, in part, by the MIT Sloan Latin America Office; the Ethics and Governance of Artificial Intelligence Initiative of the Miami Foundation; the William and Flora Hewlett Foundation; the Reset initiative of Luminate; the John Templeton Foundation; the TDF Foundation; the Canadian Institutes of Health Research; the Social Sciences and Humanities Research Council of Canada; the Australian Research Council; Google; and Facebook.