Why we fall for fake news: Hijacked thinking or laziness?
Fresh research offers new insight into why we believe the unbelievable.

Have you heard? Nancy Pelosi diverted Social Security money to fund the impeachment inquiry. President Trump's father was a member of the KKK.
Far-fetched as those statements sound, they were among the most shared fake news stories on Facebook in 2019, according to a report by the nonprofit organization Avaaz, which concluded that political fake news garnered more than 150 million views in 2019. And a study by Dartmouth College computer scientist Soroush Vosoughi, PhD, and colleagues found that fake news actually reaches more people and spreads more quickly than the truth (Science, Vol. 359, No. 6380, 2018).
And that's alarming psychologists and other behavioral researchers. "Fake news has important implications in politics, but also in areas such as health and nutrition, climate science, and financial information," says David Rand, PhD, a professor of management science and brain and cognitive sciences at MIT. "The basic question from a psychological perspective is: How can people possibly believe this stuff?"
A frequent explanation is motivated reasoning — the idea that people's cognitive processes are biased toward believing things that conform with their worldview. Hence, a liberal voter is predisposed to believe unsavory rumors about President Trump's father, while a conservative is more willing to accept that Rep. Pelosi would illegally divert public funds.
But research by Rand and colleagues challenges the idea that it's our reasoning that is biased. "The dominant explanation for why people believe fake news has been that their reasoning is held captive by partisan biases—their thinking gets hijacked," Rand says. His studies paint an alternate picture: "People who believe false things are the people that just don't think carefully," he says.
Time to think
Rand and Gordon Pennycook, PhD, an assistant professor of behavioral science at the University of Regina, in Saskatchewan, Canada, measured analytical reasoning in 3,446 American participants from Mechanical Turk. They found that higher scores on the reasoning test were associated with a better ability to distinguish fake headlines from real news headlines.
That was true even when the fake stories aligned with participants' political preferences. The authors concluded that people are more likely to fall prey to misinformation because of lazy thinking than because of any conscious or subconscious desire to protect their political identities (Cognition, Vol. 188, No. 1, 2019). In a replication study, Rand and his colleagues confirmed those results and showed the effects extend beyond blatantly false headlines to hyperpartisan headlines as well (Ross, R.M., et al., PsyArXiv, Published online, 2019). "People [who] believed false headlines tended to be the people [who] didn't think carefully, regardless of whether those headlines aligned with their ideology," Rand says.
An experimental study showed similar results. Rand, Pennycook and Bence Bago, PhD, at the University of Toulouse Capitole, in France, presented 1,635 American participants from Mechanical Turk with a series of news items. The stories — some true, some false — appeared as they would on social media, as screenshots that showed the headline, the source and the first few sentences of a news story. First, participants were asked to make a quick judgment about whether the news was real or fake while they were holding unrelated information about a visual pattern in their working memory. Then they saw the news item again, and could take their time mulling over the veracity of the story, with no time pressure, and no working memory load to carry. When people took time to think, they improved at discerning the truth, whether or not their political identity aligned with the news (Journal of Experimental Psychology: General, Published online, 2020).
The takeaway, Rand says, is that people who scroll quickly through social media might be less susceptible to misinformation if they simply slow down to consider what they're reading. "Our findings suggest that getting people to reason more is a good thing," he says. "When you're on social media, stop and think."
Rand's new findings seem to bump up against the long-established idea that we're less critical of information that reinforces our ideology or identity. Indeed, there's plenty of evidence that partisan alignment affects our judgments. In a meta-analysis of 51 experimental studies, Peter Ditto, PhD, a professor of psychological science at the University of California, Irvine, and colleagues found that liberals and conservatives were both likely to evaluate information more favorably when it supported their own political beliefs (Perspectives on Psychological Science, Vol. 14, No. 2, 2019).
Rand doesn't contest the idea that ideology can affect our judgment. His point: "Where ideological bias exists, this bias is typically caused by intuitive processes rather than by reasoning," he says. He acknowledges that reasoning might boost bias in situations where the misinformation is more subtle, such as climate change studies in which complicated data could be spun multiple ways. "For cases that are more ambiguous, there may be more wiggle room to convince yourself of the conclusion you are motivated to believe."
On the other hand, bias seems to have less impact on reasoning when people are evaluating fake news stories that are blatantly inaccurate. "In that case, you don't have that intellectual wiggle room. Thinking more will help you get to the right answer."
Nudging toward accuracy
Steven Sloman, PhD, a cognitive scientist at Brown University who has written about the science of fake news (Lazer, D.M.J., et al., Science, Vol. 359, No. 6380, 2018), says Rand's studies on lazy thinking raise some interesting points, but more work is needed to determine whether those findings extend to the messy real world. Participants in the study were being asked to differentiate true stories from false, which may have automatically put them into a more critical, deliberative mindset, he says.
"They're in a different frame of mind than people who are just scrolling through social media or having conversations with friends." That social piece is also an important area for further study, he adds. "People aren't information processors like computers are. We're social animals, and an important function of fake news is that it's an indicator of what other people believe."
The social component is also what makes fake news such a thorny problem to solve, says Ditto. "Whether or not people believe fake news isn't just a cognitive process. It's socially reinforced," he says. You see people who think like you, all believing the same things, and offering you new information that reinforces those beliefs. "It's a team effort, and that's where fake news gets a lot of its power," he says.
But there are ways to start defusing that power, Rand says. "Social media platforms can nudge people to think about accuracy." In another 2019 paper, he and Pennycook showed that using crowdsourcing to rank the accuracy of news on social media could be an effective way to reduce the spread of misinformation (PNAS, Vol. 116, No. 7, 2019).
Rand, who is advising Facebook on how to use crowdsourcing, says there's a lot of room for psychologists to apply their expertise. "This is an entirely psychological question, and a place where psychology has a real opportunity to make a contribution to a problem of great public interest."
Source: https://www.apa.org/news/apa/2020/fake-news