Debunking: Why a fact-check isn’t enough
What do you usually do when you realize someone’s beliefs are incorrect? Quite often, we try to correct the other person: we explain that they are wrong and why. If you show them enough proof that their beliefs are false, they will have to change them, right? Yet research shows that simply providing evidence that information is incorrect – also known as “debunking” – is often not enough to convince people to change their beliefs. The challenge here stems from the “continued influence effect”: fact-checking does not fully end the influence of disinformation, even when people agree with the debunking message. Why is that the case?
To understand events, people build “mental models” – representations of what is happening in the world around them. For people to believe a story is true, they prefer it to be logical, coherent, and complete. When part of their mental model gets debunked, people are left with a gap in their understanding of events. Subsequently, when asked about the event, they may fall back on the disinformation to restore a coherent story. In short, even though people agree with the fact-check, they may continue to use disinformation to make sense of the world. Another explanation for the continued influence effect lies in how we remember things: people might, for example, confuse the disinformation with the corrected information. To successfully replace disinformation, debunking messages should be detailed enough for people to fill the gap in their mental models and abandon the incorrect information entirely. They should therefore always include an alternative explanation of the situation that is plausible, detailed, and well-argued.
Even if people do manage to correct their beliefs about whether a piece of information is true or false, their feelings about the topic may remain unchanged. A study on false statements and voting intentions illustrates this. Participants were presented with true and false statements by Donald Trump and asked to rate how strongly they believed them. They were then shown corrections of the false statements and confirmations of the true ones. In response, participants updated their belief ratings. You might assume this would also affect their voting intentions. Yet even though Trump supporters updated their beliefs about which statements were true or false, their voting intentions and feelings towards him remained the same.
In addition, research shows that debunking messages spread more slowly than the disinformation itself, making it difficult to correct disinformation after the fact. Debunking is also often costly, labour-intensive, and slow. It can be effective under the right circumstances, but it has not been effective enough to solve the disinformation crisis thus far.