By Alton Parrish (Reporter)

Misinformation: Psychological Science Shows Why It Sticks And How To Fix It

Thursday, September 20, 2012 3:22

(Before It's News)

 

Childhood vaccines do not cause autism. Barack Obama was born in the United States. Global warming is confirmed by science. And yet, many people believe claims to the contrary.

Why does that kind of misinformation stick? A new report published in Psychological Science in the Public Interest, a journal of the Association for Psychological Science, explores this phenomenon. Psychological scientist Stephan Lewandowsky of the University of Western Australia and colleagues highlight the cognitive factors that make certain pieces of misinformation so “sticky” and identify some techniques that may be effective in debunking or counteracting erroneous beliefs.

Stephan Lewandowsky. Credit: University of Western Australia


The main reason that misinformation is sticky, according to the researchers, is that rejecting information actually requires cognitive effort. Weighing the plausibility and the source of a message is cognitively more difficult than simply accepting that the message is true – it requires additional motivational and cognitive resources. If the topic isn’t very important to you or you have other things on your mind, misinformation is more likely to take hold.

And when we do take the time to thoughtfully evaluate incoming information, there are only a few features that we are likely to pay attention to: Does the information fit with other things I believe in? Does it make a coherent story with what I already know? Does it come from a credible source? Do others believe it?

Misinformation is especially sticky when it conforms to our preexisting political, religious, or social point of view. Because of this, ideology and personal worldviews can be especially difficult obstacles to overcome.

Even worse, efforts to retract misinformation often backfire, paradoxically amplifying the effect of the erroneous belief.

“This persistence of misinformation has fairly alarming implications in a democracy because people may base decisions on information that, at some level, they know to be false,” says Lewandowsky.

“At an individual level, misinformation about health issues—for example, unwarranted fears regarding vaccinations or unwarranted trust in alternative medicine—can do a lot of damage. At a societal level, persistent misinformation about political issues (e.g., Obama’s health care reform) can create considerable harm. On a global scale, misinformation about climate change is currently delaying mitigative action.”

Though misinformation may be difficult to correct, all is not lost. According to Lewandowsky, “psychological science has the potential to counteract all those harms by educating people and communicators about the power of misinformation and how to meet it.”

In their report, Lewandowsky and colleagues offer some strategies for setting the record straight.

  • Provide people with a narrative that fills the gap left by the false information
  • Focus on the facts you want to highlight, rather than the myths
  • Make sure that the information you want people to take away is simple and brief
  • Consider your audience and the beliefs they are likely to hold
  • Strengthen your message through repetition

Research has shown that attempts at “debiasing” can be effective in the real world when based on these evidence-based strategies.

 
According to the report: Misinformation can be disseminated in a number of ways, often in the absence of any intent to mislead. For example, the timely news coverage of unfolding events is by its very nature piecemeal and requires occasional corrections of earlier statements. As a case in point, the death toll after a major natural disaster—such as the 2011 tsunami in Japan—is necessarily updated until a final estimate becomes available. Similarly, a piece of information that is considered “correct” at any given stage can later turn out to have been erroneous.

Indeed, this piecemeal approach to knowledge construction is the very essence of the scientific process, through which isolated initial findings are sometimes refuted or found not to be replicable. It is for this reason that scientific conclusions are usually made and accepted only after some form of consensus has been reached on the basis of multiple lines of converging evidence. 

 
Misinformation that arises during an evolving event or during the updating of knowledge is unavoidable as well as unintentional; however, other sources of misinformation are arguably less benign. The particular sources the report discusses are:

Rumors and fiction. Societies have struggled with the misinformation-spreading effects of rumors for centuries, if not millennia; what is perhaps less obvious is that even works of fiction can give rise to lasting misconceptions of the facts.

Governments and politicians. Governments and politicians can be powerful sources of misinformation, whether inadvertently or by design.

Vested interests. Corporate interests have a long and well-documented history of seeking to influence public debate by promulgating incorrect information. At least on some recent occasions, such systematic campaigns have also been directed against corporate interests, by nongovernmental interest groups.

The media. Though the media are by definition seeking to inform the public, it is notable that they are particularly prone to spreading misinformation for systemic reasons that are worthy of analysis and exposure. With regard to new media, the Internet has placed immense quantities of information at our fingertips, but it has also contributed to the spread of misinformation. The growing use of social networks may foster the quick and wide dissemination of misinformation. The fractionation of the information landscape by new media is an important contributor to misinformation’s particular resilience to correction.

The report, “Misinformation and Its Correction: Continued Influence and Successful Debiasing,” is published in the September issue of Psychological Science in the Public Interest and is written by Stephan Lewandowsky and Ullrich Ecker of the University of Western Australia, Colleen Seifert and Norbert Schwarz of the University of Michigan, and John Cook of the University of Queensland and the University of Western Australia.

The report also features a commentary written by Edward Maibach of George Mason University.

The full report and the accompanying commentary are available free online.

Contacts and sources:
Anna Mikulak
Association for Psychological Science
