The Debunking Handbook Part 3: The Overkill Backfire Effect

The Debunking Handbook is a guide to debunking myths, by John Cook and Stephan Lewandowsky. Although there is a great deal of psychological research on misinformation, unfortunately there is no summary of the literature that offers practical guidelines on the most effective ways of reducing the influence of misinformation. This Handbook boils down the research into a short, simple summary, intended as a guide for communicators in all areas (not just climate) who encounter misinformation.

This is part three in a five-part series cross-posted from Skeptical Science.

One principle that science communicators often fail to follow is making their content easy to process. That means easy to read, easy to understand and succinct. Information that is easy to process is more likely to be accepted as true.1 Merely enhancing the colour contrast of a printed font so it is easier to read, for example, can increase people’s acceptance of the truth of a statement.2

Common wisdom is that the more counter-arguments you provide, the more successful you’ll be in debunking a myth. It turns out that the opposite can be true. When it comes to refuting misinformation, less can be more. Debunks that offered three arguments, for example, were more successful in reducing the influence of misinformation than debunks that offered twelve arguments, which ended up reinforcing the myth.1

The Overkill Backfire Effect occurs because processing many arguments takes more effort than just considering a few. A simple myth is more cognitively attractive than an over-complicated correction.

The solution is to keep your content lean, mean and easy to read. Making your content easy to process means using every tool available. Use simple language, short sentences, subheadings and paragraphs. Avoid dramatic language and derogatory comments that alienate people. Stick to the facts.

End on a strong and simple message that people will remember and tweet to their friends, such as “97 out of 100 climate scientists agree that humans are causing global warming”; or “Study shows that MMR vaccines are safe.” Use graphics wherever possible to illustrate your points.

Scientists have long followed the principles of the Information Deficit Model, which suggests that people hold erroneous views because they don’t have all the information. But too much information can backfire. Adhere instead to the KISS principle: Keep It Simple, Stupid!

The Debunking Handbook is now freely available to download.


  1. Schwarz, N., Sanna, L., Skurnik, I., & Yoon, C. (2007). Metacognitive experiences and the intricacies of setting people straight: Implications for debiasing and public information campaigns. Advances in Experimental Social Psychology, 39, 127-161.
  2. Reber, R., & Schwarz, N. (1999). Effects of perceptual fluency on judgments of truth. Consciousness and Cognition, 8, 338-342.

4 Responses to The Debunking Handbook Part 3: The Overkill Backfire Effect

  1. Mike Roddy says:

    This is all interesting, but the science here is not always robust.

    For example, if a denier is thoroughly humiliated, yes he will respond forcefully, and refuse to budge an inch. However, there are two problems with this result. The first is that the audience matters, not just the denier whose falsehoods are being addressed.

    The second problem is time. If a denier is thoroughly humiliated, this information triggers mental system rebooting. After this occurs, the debunking can be quite successful, but discarding foundational beliefs requires a gestation period.

    Finally, deniers must not be countered only with facts, and not just because they don’t respond to them. In many cases, the denier oeuvre is poisoned from top to bottom. Chiseling away at its many distortions becomes a tedious process. It’s better sometimes to just go ahead and attack the rotten foundation. This strategy adheres to the keep-it-simple notion Cook’s people expressed, and can change the minds of those who think more holistically.

  2. Ernest says:

    This is interesting and makes a lot of sense. Usually, there’s more at work than a free intellectual exchange. There’s a whole cognitive framework, value system, and now political affiliation associated with the topic of climate change. It’s pretty dug in. This makes changing of minds during the conversation unlikely. The best that can be accomplished may be to give the person pause. Something inoffensive, matter of fact, non-argumentative, maybe appealing to one of their trusted authorities, such as “the Pentagon sees climate change as a threat multiplier”.

    At this point I don’t see much change coming from argumentation. The “information” is pretty much out there, for both sides. People pretty much choose what they want to believe. More sustained weird weather can begin to shift attitudes. So can economic damage. The drone of scientific articles on climate change in newspapers can also give pause. So can international concern, an increasing number of businesses concerned about mitigating their risks, things that have material effect irrespective of belief. It may not be necessary to convince the “hard core”. Convincing the “soft core” may be more effective. The hard core denialists will find the whole world shifting around them.

  3. Geoff Beacon says:

    My off-the-cuff comment in Part 2 on MMR made me look at the controversy again. I’ve a friend that is keen on Dr Mercola and some of his stuff rings true for me so I Googled “Dr Mercola MMR”. Consequence: I’m really starting to panic about conventional science after reading his One of the Most Dangerous ‘Drinks’ You Can Give Your Child.

    Having read Dr Mercola, I just don’t think it is true to say “MMR vaccines are safe.”

    The panic is this: Have I been taken in by an MMR science denier just as many are taken in by climate science deniers?

    I find it hard to accept the message of the Debunking Handbook because I think they are painting a very incomplete summary of the MMR issue. Does the handbook have a message about destroying confidence in this way?

    P.S. Where should I have put the following to keep this effective?

    It may be true to say “Study shows that MMR vaccines are safe.” It may even be true to say “It is safer for the whole of society if everybody has the MMR vaccine” perhaps even “It is safer for an individual to have an MMR vaccine than not have one.”

    Does a postscript confuse the message more than putting this in the text?

  4. Anna Haynes says:

    An unfortunate example: Berkeley’s Evolution FAQ.