February 11, 2012


Green Army: Research and Development




The Debunking Handbook



  • John Cook (Global Change Institute, University of Queensland) and Stephan Lewandowsky (School of Psychology, University of Western Australia), The Debunking Handbook, Version 2, St Lucia, Australia: University of Queensland, 23 January 2012. ISBN 978-0-646-56812-6.

    Debunking the first myth about debunking


    Unless great care is taken, any effort to debunk misinformation can inadvertently reinforce the very myths one seeks to correct.
    To avoid these “backfire effects”, an effective debunking requires three major elements.
    • First, the refutation must focus on core facts rather than the myth to avoid the misinformation becoming more familiar.
    • Second, any mention of a myth should be preceded by explicit warnings to notify the reader that the upcoming information is false.
    • Finally, the refutation should include an alternative explanation that accounts for important qualities in the original misinformation.

    [D]emocratic societies [depend] on accurate information. …

    A common misconception [is] that public misperceptions are due to a lack of knowledge and[, therefore,] the solution is more information [— the so-called] “information deficit model”. …

    [To be effective] communicators need to understand
    • how people process information,
    • how they modify their existing knowledge and
    • how worldviews affect their ability to think rationally.
    It’s not just what people think that matters, but how they think.

    [“Misinformation” refers] to any information that people have acquired that turns out to be incorrect, irrespective of why and how that information was acquired in the first place.
    We are concerned with the cognitive processes that govern how people process corrections to information they have already acquired …
    [If] you find out that something you believe is wrong, how do you update your knowledge and memory? …

    [In] a 1994 experiment … people were exposed to misinformation about a fictitious warehouse fire, then given a correction clarifying the parts of the story that were incorrect.
    Despite remembering and accepting the correction, people still showed a lingering effect, referring to the misinformation when answering questions about the story. …

    The evidence indicates that no matter how vigorously and repeatedly we correct the misinformation … the influence remains detectable.

    [Indeed, attempts to debunk] a myth can actually strengthen it in people’s minds.
    Several [such] “backfire effects” have been observed, arising
    • from making myths more familiar,
    • from providing too many arguments, or
    • from providing evidence that threatens one’s worldview.
    (p 1)


    The Familiarity Backfire Effect


    [People] were shown a flyer that debunked common myths about flu vaccines.
    Afterwards, they were asked to separate the myths from the facts.
    When asked immediately after reading the flyer, people successfully identified the myths.
    However, when queried 30 minutes later, some people actually scored worse for having read the flyer.
    The debunking reinforced the myths.

    Ideally, avoid mentioning the myth altogether while correcting it.
    When seeking to counter misinformation, the best approach is to focus on the facts you wish to communicate.

    [Often, however,] not mentioning the myth [is] not a practical option. …
    Your debunking should begin with emphasis on the facts, not the myth.
    Your goal is to increase people’s familiarity with the facts.
    (p 2)


    The Overkill Backfire Effect


    When … refuting misinformation, less can be more.
    Generating three arguments … can be more successful in reducing misperceptions than generating twelve arguments, which can end up reinforcing the initial misperception.

    [This] occurs because processing many arguments takes more effort than just considering a few.
    A simple myth is more cognitively attractive than an over-complicated correction.
    • [Keep your content] easy to process [by using] simple language, short sentences, subheadings and paragraphs.
    • Avoid dramatic language and derogatory comments[. S]tick to the facts.
    • [Use graphics wherever possible …]
    • End on a strong and simple message …
    Writing at a simple level runs the risk of sacrificing the complexities and nuances[; hence,] Skeptical Science [publishes] rebuttals at several levels.
    Basic versions are written using short, plain English text and simplified graphics.
    Intermediate and Advanced versions are also available, with more technical language and detailed explanations.
    (p 3)


    Filling the gap with an alternative explanation


    [In] an experiment in which people read a fictitious account of a warehouse fire, [mention] was made of paint and gas cans along with explosions.
    Later … it was clarified that paint and cans were not present at the fire.
    Even when people remembered and accepted this correction, they still cited the paint or cans when asked questions about the fire.
    When asked,
    “Why do you think there was so much smoke?”,
    people routinely invoked the oil paint despite having just acknowledged it as not being present.
    [When an alternative explanation involving lighter fluid and accelerant was provided, people were less likely to cite the paint and gas cans when queried about the fire.]

    [P]eople prefer an incorrect model over an incomplete model.
    In the absence of a better explanation, they opt for the wrong explanation. …
    The most effective way to reduce the effect of misinformation is to provide an alternative explanation …

    [Similarly,] in fictional murder trials [accusing] an alternative suspect greatly reduced the number of guilty verdicts … compared to defences that [only] explained why the defendant wasn’t guilty. …
    When you debunk a myth, you create a gap in the person’s mind.
    To be effective, your debunking must fill that gap.

    [The] most effective [rebuttal structure combines] an alternative explanation [with] an explicit warning [given before mentioning the myth]. …

    When people read a refutation that conflicts with their beliefs, they seize on ambiguities to construct an alternative interpretation.
    Graphics provide more clarity and less opportunity for misinterpretation.
    When self-identified Republicans were surveyed about their global warming beliefs, a significantly greater number accepted global warming when shown a graph of temperature trends compared to those who were given a written description.
    (p 5)


    The Worldview Backfire Effect


    For those who are strongly fixed in their views, being confronted with counter-arguments can cause their views to be strengthened.

    One cognitive process that contributes to this effect is Confirmation Bias …
    In one experiment, people were offered information on … gun control … labelled by its source … e.g., the National Rifle Association vs. Citizens Against Handguns …
    [When] presented with a balanced set of facts, [the subjects reinforced] their pre-existing views by gravitating towards information they already agreed with.
    The polarisation was greatest among those with strongly held views.

    [When presented] with arguments that run counter to their worldview … the cognitive process that comes to the fore is Disconfirmation Bias … where people spend significantly more time and thought actively arguing against opposing arguments.

    [W]hen Republicans who believed Saddam Hussein was linked to the 9/11 terrorist attacks were provided with evidence that there was no link between the two [only] 2% of participants changed their mind (although interestingly, 14% denied that they believed the link in the first place).
    The vast majority clung to the link between Iraq and 9/11 …
    The most common response was attitude bolstering — bringing supporting facts to mind while ignoring any contrary facts [— resulting in a] strengthening [of] erroneous belief. …

    [T]his suggests that
    • outreaches should be directed towards the undecided majority rather than the unswayable minority.
    • [messages should] be presented in ways that reduce the usual psychological resistance.
    • For example,
      • when worldview-threatening messages are coupled with … self-affirmation [people are more receptive to messages that might otherwise threaten their worldviews.]
        [This] can be achieved by asking people to write a few sentences about a time when they felt good about themselves because they acted on a value that was important to them. …
        [This] “self-affirmation effect” is strongest among those whose ideology [is] central to their sense of self-worth.
      • [“Framing” the information] in a way that is [least] threatening …
        For example, Republicans are far more likely to accept an otherwise identical charge as a “carbon offset” than as a “tax”, whereas the wording has little effect on Democrats or Independents — because their values are not challenged by the word “tax”.
    (p 4)


    Anatomy of an effective debunking


    Core facts

    [A] refutation should emphasise the facts, not the myth:
    [97 out of 100 climate experts agree humans are causing global warming.
    Several independent surveys find 97% of climate scientists who are actively publishing peer-reviewed climate research agree that humans are causing global warming.]
    Present only key facts to avoid an Overkill Backfire Effect.


    Explicit warnings

    [Before] any mention of a myth, text or visual cues should warn that the upcoming information is false:
    [However, movements that deny a scientific consensus have always sought to cast doubt on the fact that a consensus exists.
    One technique is the use of fake experts, citing scientists who have little to no expertise in the particular field of science.
    For example, the OISM Petition Project claims 31,000 scientists disagree with the scientific consensus on global warming.]


    Alternative explanation

    [A]ny gaps left by the debunking need to be filled:
    [However, around 99.9% of the scientists listed in the Petition Project are not climate scientists.
    The petition is open to anyone with a Bachelor of Science or higher and includes medical doctors, mechanical engineers and computer scientists.]
    This may be achieved by providing an alternative causal explanation for why the myth is wrong and, optionally, why the misinformers promoted the myth in the first place.


    Graphics

    [Core] facts should be displayed graphically if possible.
    (p 6)

    Would you like to know more?
