
Fake News and Bots May Be Worrisome, but Their Political Power Is Overblown



It’s very hard to change people’s minds, especially when so many are already committed partisans.

A book about “Fake News” was displayed last November by a supporter of Roy Moore, who unsuccessfully ran for Senate in Alabama. Credit: Brynn Anderson/Associated Press

By Brendan Nyhan

Feb. 13, 2018

How easy is it to change people’s votes in an election?

The answer, a growing number of studies conclude, is that most forms of political persuasion seem to have little effect at all.

This conclusion may sound jarring at a time when people are concerned about the effects of the false news articles that flooded Facebook and other online outlets during the 2016 election. Observers speculated that these so-called fake news articles swung the election to Donald J. Trump. Similar suggestions of large persuasion effects, supposedly pushing Mr. Trump to victory, have been made about online advertising from the firm Cambridge Analytica and content promoted by Russian bots.

Much more remains to be learned about the effects of these types of online activities, but people should not assume they had huge effects. Previous studies have found, for instance, that the effects of even television advertising (arguably a higher-impact medium) are very small. According to one credible estimate, the net effect of exposure to an additional ad shifts the partisan vote of approximately two people out of 10,000.

In fact, a recent meta-analysis of many forms of campaign persuasion, including in-person canvassing and mail, finds that their average effect in general elections is zero.

Field experiments testing the effects of online ads on political candidates and issues have also found null effects. We shouldn’t be surprised — it’s hard to change people’s minds! Their votes are shaped by fundamental factors like which party they typically support and how they view the state of the economy. “Fake news” and bots are likely to have vastly smaller effects, especially given how polarized our politics have become.

Here’s what you should look for in evaluating claims about vast persuasion effects from dubious online content:

How many people actually saw the questionable material. Many alarming statistics have been produced since the election about how many times “fake news” was shared on Facebook or how many times Russian bots retweeted content on Twitter. These statistics obscure the fact that the content being shared may not reach many Americans (most people are not on Twitter and consume relatively little political news) or even many humans (many bot followers may themselves be bots).

Whether the people being exposed are persuadable. Dubious political content online is disproportionately likely to reach heavy news consumers who already have strong opinions. For instance, a study I conducted with Andrew Guess of Princeton and Jason Reifler of the University of Exeter in Britain showed that exposure to fake news websites before the 2016 election was heavily concentrated among the 10 percent of Americans with the most conservative information diets — not exactly swing voters.

The proportion of news people saw that was bogus. The total number of shares or likes that fake news and bots attract can sound enormous until you consider how much information circulates online. Twitter, for instance, reported that Russian bots tweeted 2.1 million times before the election — certainly a worrisome number. But these represented only 1 percent of all election-related tweets and 0.5 percent of views of election-related tweets.

Similarly, my study with Mr. Guess and Mr. Reifler found that Trump supporters visited a mean of 13.1 articles on fake news websites, but only 40 percent of his supporters visited such websites at all, and those articles made up only about 6 percent of the pages those supporters visited on sites focusing on news topics.

None of these findings indicate that fake news and bots aren’t worrisome signs for American democracy. They can mislead and polarize citizens, undermine trust in the media, and distort the content of public debate. But those who want to combat online misinformation should take steps based on evidence and data, not hype or speculation.


Brendan Nyhan is a professor of government at Dartmouth College. Follow him on Twitter at @BrendanNyhan.


