Following Donald Trump's hotly debated presidential victory this month, fake news has itself become news. Some have attributed the election results in part to the ease with which inaccurate, hyperpartisan information circulates on social media, prompting questions about why people believe fake news even when it is clearly false or satirical. That tendency to believe inaccurate information warrants examination, and a recent research study has done exactly that. Whether or not fake news actually swayed the election, the internet has made spreading misinformation easier than ever; that much is clear.
The answer, it turns out, has less to do with deliberate ignorance and more to do with the way the brain works. In a review published in Current Directions in Psychological Science, Northwestern University professor David Rapp explains why people believe inaccurate information despite knowing better. It comes down to how memories are formed. As you may know from experience, first impressions are difficult to change, and the same principle applies to information. According to Rapp, people tend to accept news at face value because it's easier than being critical, especially considering the sheer amount of information encountered every day. "It's a nightmare to critically evaluate all of it," Rapp said, according to Science Daily.
The problem arises when people later read correct information but still rely on the original "facts." Because it was encoded into memory first, the misinformation will come to mind more easily than the correction.
"If it's available, people tend to think they can rely on it. But just because you can remember what someone said, doesn't make it true," said Rapp. This is further complicated by the egalitarian nature of the internet, where facts and misinformation may be presented right on top of each other in a social media feed.
It's not necessarily that people are lazy, or at least not only that people are lazy, but that our brains prefer to conserve energy and take the fastest route. Most of the time, this speed is a good thing. But it can also be what leads to your uncle sharing an article about Pope Francis endorsing Trump prior to the election even though the Pope did nothing of the sort, or to someone believing satirical content from sites like RealTrueNews, whose creator wrote in the Daily Beast about the attention their fake news stories received during this year's election.
So that's the psychological side of things. In the wake of the election, however, many have pointed out that social media can create a bubble of sorts: The idea is that we tend to be friends with like-minded people, so we're likely to encounter news and information shared by people with similar political beliefs. Earlier this year, a study found that people tend to seek out information that confirms their own beliefs and share it online, creating what researchers called a political "echo chamber." In October, BuzzFeed published a report charting the rise in social media engagement of fake news stories in the months leading up to the election; according to the analysis, stories from prominent fake news sources generated more reactions online than those from mainstream sources.
Facebook CEO Mark Zuckerberg initially dismissed concerns over the site's political influence. "Voters make decisions based on their lived experience," he said at the Techonomy conference in early November. But in the weeks since, Facebook has updated its Audience Network policy to explicitly target fake news and will prohibit ads from these sites; similarly, Google announced it will ban fake news sites from purchasing advertising.
But as easy as it is to blame social media for political polarization, the spread of fake news stems from many factors, including misleading statements by politicians themselves. As the research above shows, human nature plays a significant role in believing what we read online, and it certainly doesn't hurt if that information confirms our preconceptions. At least we're talking about fake news; what we'll do about it remains to be seen.