When corrections strengthen false beliefs
What happens when you show someone evidence that their belief is wrong?
The rational expectation: they update their belief.
What Nyhan & Reifler found in 2010: sometimes, corrections strengthen the very belief they're meant to correct.
This is the Backfire Effect: when challenging someone's deeply held beliefs with contradictory evidence makes them believe the falsehood even more strongly.
Note: Recent research has challenged how common this effect is. But understanding when and why it happens remains crucial for anyone trying to correct misinformation.
The Original Research: Nyhan & Reifler (2010) found that when conservatives read a correction stating Iraq had no WMDs, they became more likely to believe Iraq had WMDs than before reading the correction.
Why It Happens: When a correction threatens a belief bound up with someone's worldview or identity, they tend to counter-argue the evidence rather than accept it, and rehearsing those counterarguments can leave the original belief feeling better supported than before.
How Common Is It: A 2019 study ("The Elusive Backfire Effect") tested corrections across 52 issues with more than 10,000 participants and found backfire effects are rare. Corrections usually work. But when beliefs are deeply tied to worldview and identity, backfire remains a risk.
What Works Better: Affirming someone's values before correcting, providing alternative explanations, avoiding repetition of the myth, and framing corrections positively rather than negatively.
References:
Nyhan, B., & Reifler, J. (2010). "When Corrections Fail: The Persistence of Political Misperceptions." Political Behavior, 32(2), 303–330.
Wood, T., & Porter, E. (2019). "The Elusive Backfire Effect: Mass Attitudes' Steadfast Factual Adherence." Political Behavior, 41(1), 135–163.