MIT Sloan study finds that similar solutions to slow the spread of online misinformation can be effective in countries worldwide.
CAMBRIDGE, Mass., June 29, 2023 /PRNewswire/ -- Since the 2016 U.S. Presidential election and British "Brexit" referendum — and then COVID-19 — opened the floodgates on fake news, academic research has delved into the psychology behind online misinformation and suggested interventions that might help curb the phenomenon.
Most studies have focused on misinformation in the West, even though misinformation is a dire global problem. In 2017, false information on Facebook was implicated in genocide against the Rohingya minority group in Myanmar, and in 2020 at least two dozen people were killed in mob lynchings after rumors spread on WhatsApp in India.
In a new paper published today in Nature Human Behaviour, "Understanding and Combating Misinformation Across 16 Countries on Six Continents," MIT Sloan School of Management professor David Rand found that the traits shared by misinformation spreaders are surprisingly similar worldwide. Working with a dozen colleagues from universities across the globe, including lead author Professor Antonio A. Arechar of the Center for Research and Teaching in Economics in Aguascalientes, Mexico, and joint senior author Professor Gordon Pennycook of Cornell University, the researchers also found common strategies and solutions to slow the spread of fake news that can be effective across countries.
"There's a general psychology of misinformation that transcends cultural differences," Rand said. "And broadly across many countries, we found that getting people to think about accuracy when deciding what to share, as well as showing them digital literacy tips, made them less likely to share false claims."
The study, conducted in early 2021, found large variations across countries in how likely participants were to believe false statements about COVID-19 — for example, those in India believed false claims more than twice as much as those in the U.K. Overall, people from individualistic countries with open political systems were better at discerning true statements from falsehoods, the researchers found. However, in every country, they found that reflective analytic thinkers were less susceptible to believing false statements than people who relied on their gut.
"Pretty much everywhere we looked, people who were better critical thinkers and who cared more about accuracy were less likely to believe the false claims," Rand said. In the study, people who got all three questions wrong on a critical thinking quiz were twice as likely to believe false claims compared to people who got all three questions correct. Valuing democracy was also consistently associated with being better at identifying truth.
Conversely, endorsing individual responsibility over government support and belief in God were both linked to more difficulty in discerning true from false statements. People who said they would definitely not get vaccinated against COVID-19 when the vaccine became available were 52.9% more likely to believe false claims than those who said they would.
The researchers also found that at the end of the study, while 79% of participants said it's very or extremely important to only share accurate news, 77% of them had nonetheless shared at least one false statement as part of the experiment.
What explains this striking disconnect? "A lot of it is people just not paying attention to accuracy," Rand said. "In a social media environment, there's so much to focus on around the social aspects of sharing — how many likes will I get, who else shared this — and we have limited cognitive bandwidth. So people often simply forget to even think about whether claims are true before they share them."
This observation suggests ways to combat the spread of misinformation. Participants across countries who were nudged to focus on accuracy by evaluating the truth of an initial non-COVID-related question were nearly 10% less likely to share a false statement. Very simple digital literacy tips help focus attention on accuracy too, said Rand, and those who read them were 8.4% less likely to share a false statement. The researchers also discovered that when averaging as few as 20 participant ratings, it was possible to identify false headlines with very high accuracy, leveraging the "wisdom of crowds" to help platforms spot misinformation at scale.
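To make the crowd-rating idea concrete, here is a minimal Python sketch (not code from the study) that averages a small set of hypothetical participant ratings per headline on a six-point accuracy scale and flags headlines whose average falls below an assumed midpoint cutoff. The ratings, headline wording, and cutoff value are illustrative assumptions only.

```python
# Minimal sketch of the "wisdom of crowds" aggregation described above:
# average ~20 participant ratings per headline and flag low-scoring ones.
from statistics import mean

# Hypothetical ratings on a six-point accuracy scale
# (1 = definitely false, 6 = definitely true), ~20 raters per headline.
crowd_ratings = {
    "Hand soaps suppress your immune system": [1, 2, 1, 3, 2, 1, 2, 2, 1, 3,
                                               2, 1, 2, 2, 1, 1, 3, 2, 1, 2],
    "Shoes are very unlikely to spread COVID-19": [5, 6, 4, 5, 6, 5, 4, 5, 6, 5,
                                                   4, 5, 6, 5, 5, 4, 6, 5, 5, 6],
}

CUTOFF = 3.5  # assumed midpoint of the six-point scale

for headline, ratings in crowd_ratings.items():
    avg = mean(ratings)
    verdict = "likely false" if avg < CUTOFF else "likely true"
    print(f"{verdict:12s} avg={avg:.2f}  {headline}")
```

In practice, a platform could aggregate such ratings continuously as users respond to accuracy prompts, which is how crowd judgments could help spot misinformation at scale.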
Methodology
In their study, Rand and colleagues recruited social media users from 16 countries across South America, Asia, the Middle East, and Africa, as well as Australia and the United States; nine languages were represented. Of the final sample of 34,286 participants, 45% were female, and the mean age was 38.7 years.
The researchers randomly assigned study participants to one of four groups: Accuracy, Sharing, Prompt, and Tips. All participants were shown 10 false and 10 true COVID-19-related statements taken from the websites of health and news organizations including the World Health Organization and the BBC. For example, one false statement read, "Masks, Gloves, Vaccines, And Synthetic Hand Soaps Suppress Your Immune System," while a true one said, "The likelihood of shoes spreading COVID-19 is very low."
Accuracy group participants were asked to assess statements' accuracy on a six-point scale, while the Sharing group rated how likely they would be to share statements. Study participants in the Prompt group were first asked to evaluate the accuracy of a non-COVID-related statement — "Flight attendant slaps crying baby during flight" — before deciding whether or not to share COVID statements. Finally, the Tips group read four simple digital literacy tips before they chose whether or not to share.
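For readers who want a concrete picture of the design, the following Python sketch is an illustration of the four-condition setup described above, not the researchers' materials; the statement texts and task descriptions are placeholders.

```python
# Illustrative sketch of the experimental design: random assignment to one of
# four conditions, each paired with the 20-statement stimulus set.
import random

CONDITIONS = ["Accuracy", "Sharing", "Prompt", "Tips"]

true_statements = [f"true statement {i}" for i in range(10)]    # placeholders
false_statements = [f"false statement {i}" for i in range(10)]  # placeholders

TASKS = {
    "Accuracy": "rate each statement's accuracy on a six-point scale",
    "Sharing": "rate how likely you would be to share each statement",
    "Prompt": "first rate one unrelated headline's accuracy, then decide whether to share",
    "Tips": "first read four digital literacy tips, then decide whether to share",
}

def assign_participant(participant_id: int) -> dict:
    """Randomly assign a participant to a condition and build their stimulus set."""
    condition = random.choice(CONDITIONS)
    stimuli = true_statements + false_statements
    random.shuffle(stimuli)  # present true and false statements in random order
    return {"id": participant_id, "condition": condition,
            "task": TASKS[condition], "stimuli": stimuli}

print(assign_participant(1)["condition"])
```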
Implications for policy
The research has implications for policymakers and organizations like the United Nations looking to combat misinformation around the globe, as well as for social media companies trying to cut down on fake news sharing. "Really for anyone doing anti-disinformation work," Rand said. "This research helps us better understand who is susceptible to misinformation, and it shows us there are globally relevant interventions to help."
Social media companies could regularly ask users to rate the accuracy of headlines as a way to prompt them to focus on accuracy and share more thoughtfully, while also gathering useful data to help identify misinformation. Future research should test these types of interventions in field experiments on social media platforms around the world, Rand said. "If you focus people back on accuracy, they are less likely to share bad content," he concluded.
Media Contact:
Casey Bayer
MIT Sloan School of Management
Director of Media Relations
[email protected]
914.584.9095
SOURCE MIT Sloan School of Management