In the world of fact-checking, a phenomenon sometimes known as “the backfire effect” can flummox a journalist who’s worked diligently to develop an airtight fact-check.

Many social science studies have found that people who hold very tightly to an ideology or their own set of “facts” can not only be unswayed by the most obvious of facts — they often dig in and cling even more to erroneous beliefs.

When that happens, even the most irrefutable fact-check has backfired.

Brendan Nyhan, an assistant professor at Dartmouth College who is working with the American Press Institute on its new fact-checking initiative, has studied the impact of fact-checking in politics and other venues. His most recent study revealed a surprising reaction from some parents who were presented with facts that refuted claims about the dangers of childhood vaccines. Parents with the least favorable attitudes toward vaccines became even more resistant after hearing evidence against the “vaccines cause autism” myth.

In today’s Q&A, Nyhan talks about the study and what it means for the impact of fact-checking.

You’ve said you found the results of the vaccine/parents study depressing. What reactions have you gotten from others in the research/scientific community?

I’m not the only person to find the results depressing, unfortunately; that’s been the principal response I’ve heard thus far. At the same time, I’m encouraged that people are becoming more aware of the risks posed by vaccine-preventable disease and the need to test the messaging that’s used to promote vaccines.

Beyond that, the most common response has been to ask what a better approach might be, but our study provides no direct guidance — none of the interventions we tested increased parents’ intention to vaccinate a future child. Our recommendation is that more studies be done, with a particular focus on the pediatrician-parent relationship, which may be the most promising context for alleviating parents’ concerns. There is some interesting research going on in this area, but more is needed.

In politics, people like the diehard anti-vaccination parents in your recent study are called “high partisans” because they hold unwaveringly to a belief system. Are they a force to be reckoned with? Or are there so few of them that they could be considered “outliers” who will always cling to misperceptions, despite the best efforts of fact-checkers?

Yes, high levels of partisanship are quite prevalent — over 30 percent of the public self-identified as a “strong Democrat” or “strong Republican” in 2008, for example. Unlike in the past, however, these partisan affiliations are now closely linked to people’s ideological views. As a result, it may be especially difficult to change those people’s minds about misperceptions that are ideological or partisan in nature.

In a 2010 study, for instance, my co-author Jason Reifler and I found that people frequently resisted corrective information that was inconsistent with their ideological views and in some cases came to believe in the misperception in question even more strongly.

Does a person’s resistance to being “corrected” by the facts have anything to do with their level of education? Does having a post-graduate degree, for example, affect whether the person will cling to misinformation?

The general pattern is that the relationship between education or political knowledge and misperceptions is negative but inconsistent. People who are more educated or know more about politics will tend to have more accurate perceptions on average, but will also be better at aligning their factual beliefs with their political views and at resisting unwelcome information, which can sometimes reverse the expected relationship.

For instance, the political scientist John Sides found that belief in the myth that President Obama is Muslim went up more among people with some college or a college degree than those with a high school degree or less between 2009 and 2010. Likewise, I found in a 2012 survey that Republicans who had high levels of political knowledge were more likely to believe in the conspiracy theory that the Obama administration had manipulated the unemployment statistics than those who were less knowledgeable.

This isn’t the first time that studies have unearthed the resilience of misinformation. Can you imagine a scenario where this could change? Or will it always be human nature?

Human psychology isn’t going to change any time soon — it will always be hard to admit that we’re wrong! Moreover, the incentives for individuals to hold accurate beliefs in politics are very weak. That’s why I advocate holding elites accountable for spreading misinformation; we know that politicians are highly risk-averse and that the stakes are much higher for them than for voters. Even if fact-checking fails to change many minds, I believe it can still help change elite behavior on the margin.
