The Debunking Handbook is a guide to debunking myths, by John Cook and Stephan Lewandowsky. This Handbook boils down the psychological research on misinformation into a short, simple summary, intended as a guide for communicators in all areas (not just climate).
This is part four in a five-part series cross-posted from Skeptical Science.
The third and arguably most potent backfire effect, the Worldview Backfire Effect, occurs with topics that tie in with people’s worldviews and sense of cultural identity. Several cognitive processes can cause people to unconsciously process information in a biased way. For those who are strongly fixed in their views, being confronted with counter-arguments can actually strengthen those views.
One cognitive process that contributes to this effect is Confirmation Bias, where people selectively seek out information that bolsters their existing view. In one experiment, people were offered information on hot-button issues like gun control or affirmative action. Each parcel of information was labelled by its source, clearly indicating whether the information would be pro or con (e.g., the National Rifle Association vs. Citizens Against Handguns). Although instructed to be even-handed, people opted for sources that matched their pre-existing views. So even when presented with a balanced set of facts, people reinforce their pre-existing views by gravitating towards information they already agree with. The polarisation was greatest among those with strongly held views.1
What happens when you remove that element of choice and present someone with arguments that run counter to their worldview? In this case, the cognitive process that comes to the fore is Disconfirmation Bias, the flipside of Confirmation Bias: people spend significantly more time and thought actively rebutting arguments that contradict their views.2