Never give up: The persistence of misinformation effects

Politicians, corporations, journalists and even scientists sometimes do it – they tell people things that later on turn out to be incorrect. Yet, getting rid of this so-called misinformation is often easier said than done as false beliefs are particularly sticky. In this blog, I zoom in on the current state of the art in misinformation research.

In 1998, Andrew Wakefield published an article claiming that the measles, mumps, and rubella (MMR) vaccination was associated with an increased risk of developing autism (Wakefield et al., 1998). The media got hold of Wakefield's findings and soon spread them around the world, causing many parents to refrain from vaccinating their children. Later, however, it became clear that Wakefield had distorted his data, which led to the retraction of his article (Cohen & Falco, 2011). Despite this revelation, many parents continue to refuse to vaccinate their children because of potential side effects (OECD, 2012).

There are plenty of cases like Wakefield's. They are not limited to the field of health but extend to the political and commercial world as well. Often, the media play an essential role in both spreading and perpetuating misinformation: information that reaches the public as ostensibly true must subsequently be corrected (Ecker, Lewandowsky, Fenton, & Martin, 2014). Clearly, misinformation can impact our beliefs and decisions, often within the blink of an eye. Yet restoring the truth is easier said than done, as people tend to continue believing the discredited information. Given that false information can have far-reaching consequences at both the individual and the societal level (think of a pandemic outbreak, or one country's unjustified invasion of another), why do people hold on so fiercely to these false beliefs? And more importantly, what makes some people more prone to falling prey to misinformation?

Spinning the mis-information-network

One explanation is that misinformation circulates repeatedly in the media. As a result, people can retrieve this information without much effort. Simply put: the more often you read or hear something, the more likely you are to recall it. By the time the correct information gains public attention, the misinformation has become established in people's minds and is therefore given persistent priority when they make inferences or form their beliefs (Lewandowsky, Ecker, Seifert, Schwarz, & Cook, 2012).

Another explanation is that exposure to information leads people to form beliefs that function as "naïve" theories, giving meaning to what they see or hear (Anderson & Lindsay, 1998). Attempts to correct invalid information leave gaps in these theories and thereby place the person in a state of uncertainty: the person is no longer able to make sense of the situation, let alone determine what is right and wrong. As uncertainty is inherently unpleasant (e.g., Tiedens & Linton, 2001), the person will often revert to the incorrect information in search of comfort and stability. Indeed, several studies show that corrective attempts are doomed to fail if they simply negate the initial misinformation (e.g., Seifert, 2002). Rather, effectively countering misinformation depends to a large extent on whether the correction is plausible and coherent (for a review, see Lewandowsky et al., 2012).

Bending backfire effects

Belief perseverance can also be influenced more directly by people's pre-existing attitudes or ideologies. For instance, Nyhan and Reifler (2010) showed in one of their experiments that attempts to correct false claims made by the Bush administration caused political conservatives to agree with those claims even more strongly than conservatives who received no corrective information. In other words, the correction strengthened rather than weakened their beliefs.

However, attitudes do not always create such "backfire effects," as Ecker and colleagues (2014) demonstrated. In their first experiment, participants read one of three fabricated news articles about a robbery. Two of these articles stated that the suspect was an Aboriginal Australian, but only one of them included a retraction of this information; the third article described a Caucasian suspect. Afterwards, participants answered questions about the article, and their responses revealed that, regardless of their level of racial prejudice, they were equally affected by the retraction: their answers contained fewer references to the falsely identified suspect. The researchers concluded that whether corrective attempts fail depends largely on whether people are being asked to change their attitudes toward an issue. In the absence of such a demand, people will most likely internalize the correct information.

Be skeptical and aware

Arguably, some cases of misinformation are neither important nor pose a direct threat to our lives, such as when an author cites a source incorrectly (e.g., Seife, 2012) or when you participate in an experimental study in which a researcher intentionally deceives you to test her hypotheses. Such circumstances do not necessarily call for an immediate change of mind or behavior.

Yet there are numerous circumstances (be it the decision to buy a fair-trade product, get a vaccination, or vote for somebody) where changing your beliefs or behavior is inevitable and even crucial to your life. Under these circumstances, it is beneficial to have "tools" at hand that can help you avoid falling prey to misinformation in the first place. One such tool is skepticism. Skeptical people are less gullible: they are not only more likely to scrutinize the information they hear or read (Ditto & Lopez, 1992) but also more willing to seek additional information (Sinaceur, 2010). Another tool is an increased awareness that much of the information we consume is not edited or fact-checked by external sources before it is distributed (Ecker, Lewandowsky, & Tang, 2010; Lewandowsky et al., 2012). Sometimes this happens out of ignorance or limited means, and sometimes deliberately, as when a person pursues a specific agenda (e.g., a politician).

Still, while skepticism and increased awareness may shield us to some extent from misinformation, neither embodies a "bullet-proof" panacea. In fact, the same tools can make you even more vulnerable to misinformation: as some researchers argue (e.g., Taber & Lodge, 2006), it takes time and effort to acquire a belief, which makes people unwilling to give it up easily.

So what, then, is the solution? From my point of view, we cannot avoid confrontation with incorrect information, especially in times when some of us enjoy unlimited access to information. But what we certainly can do is be aware that not everything we encounter is veridical, just like our own views, which are oftentimes driven by our need for certainty.

References:

Anderson, C. A., & Lindsay, J. J. (1998). The development, perseverance, and change of naive theories. Social Cognition, 16(1), 8-30.

Cohen, E., & Falco, M. (2011). Retracted autism study an 'elaborate fraud,' British journal finds. Retrieved 11 October 2014, from http://edition.cnn.com/2011/HEALTH/01/05/autism.vaccines/

Ditto, P. H., & Lopez, D. F. (1992). Motivated skepticism: use of differential decision criteria for preferred and nonpreferred conclusions. Journal of Personality and Social Psychology, 63(4), 568-584.

Ecker, U. K., Lewandowsky, S., Fenton, O., & Martin, K. (2014). Do people keep believing because they want to? Preexisting attitudes and the continued influence of misinformation. Memory & Cognition, 42(2), 292-304.

Ecker, U. K., Lewandowsky, S., & Tang, D. T. (2010). Explicit warnings reduce but do not eliminate the continued influence of misinformation. Memory & Cognition, 38(8), 1087-1100.


Lewandowsky, S., Ecker, U. K., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131.

Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303-330.

OECD. (2012). Childhood vaccination. Retrieved 11 October 2014, from http://www.oecd.org/els/family/CO1.4 Childhood vaccination - updated 081212.pdf

Seife, C. (2012). Jonah Lehrer’s Journalistic Misdeeds at Wired.com. Retrieved 11 October 2014, from http://goo.gl/s4WK4U

Seifert, C. M. (2002). The continued influence of misinformation in memory: What makes a correction effective? Psychology of Learning and Motivation, 41, 265-292.

Sinaceur, M. (2010). Suspending judgment to create value: Suspicion and trust in negotiation. Journal of Experimental Social Psychology, 46(3), 543-550.

Taber, C. S., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50(3), 755-769.

Tiedens, L. Z., & Linton, S. (2001). Judgment under emotional certainty and uncertainty: The effects of specific emotions on information processing. Journal of Personality and Social Psychology, 81(6), 973-988.

Wakefield, A. J., Murch, S. H., Anthony, A., Linnell, J., Casson, D., Malik, M., . . . Harvey, P. (1998). RETRACTED: Ileal-lymphoid-nodular hyperplasia, non-specific colitis, and pervasive developmental disorder in children. The Lancet, 351(9103), 637-641.