
Researchers find that facts can often backfire, adding strength to misinformation

This topic is archived.
Home » Discuss » Archives » General Discussion (1/22-2007 thru 12/14/2010)
 
guruoo Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Jul-12-10 09:19 AM
Original message
Researchers find that facts can often backfire, adding strength to misinformation
Edited on Mon Jul-12-10 09:23 AM by guruoo
How facts backfire
Researchers discover a surprising threat to democracy: our brains

By Joe Keohane | July 11, 2010

...

In the end, truth will out. Won’t it?
Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.

“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”

...

But researchers are working on it. One avenue may involve self-esteem. Nyhan worked on one study in which he showed that people who were given a self-affirmation exercise were more likely to consider new information than people who had not. In other words, if you feel good about yourself, you’ll listen — and if you feel insecure or threatened, you won’t. This would also explain why demagogues benefit from keeping people agitated. The more threatened people feel, the less likely they are to listen to dissenting opinions, and the more easily controlled they are.

There are also some cases where directness works. Kuklinski’s welfare study suggested that people will actually update their beliefs if you hit them “between the eyes” with bluntly presented, objective facts that contradict their preconceived ideas. He asked one group of participants what percentage of its budget they believed the federal government spent on welfare, and what percentage they believed the government should spend. Another group was given the same questions, but the second group was immediately told the correct percentage the government spends on welfare (1 percent). They were then asked, with that in mind, what the government should spend. Regardless of how wrong they had been before receiving the information, the second group indeed adjusted their answer to reflect the correct fact.

...

http://www.boston.com/bostonglobe/ideas/articles/2010/07/11/how_facts_backfire/
slackmaster Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Jul-12-10 09:23 AM
Response to Original message
1. Great stuff. Highly recommended.
It spells out phenomena that I see almost daily.
 
ZombieHorde Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Jul-12-10 09:25 AM
Response to Original message
2. Strongly recommended. This is something we should know about ourselves.
We can see this in action here on DU.
 
JoePhilly Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Jul-12-10 09:52 AM
Response to Original message
3. There is a great deal of research on these cognitive effects ...
The article describes potential reasons why people accept information that confirms their existing beliefs while tending to reject information that refutes them.

One simple explanation is based on two basic forms of human learning ... assimilation versus accommodation. As we develop models of the world, we tend to assimilate confirming information, while having to accommodate disconfirming information.

Here's a simple example ...

Say you have a very young child in a car, and they look out the window and see a dog. They've seen dogs before, so the child points at the dog and says "dog." You congratulate them. They feel good.

Then as you drive, the child sees a cow. They've never seen a cow before, so they point and say "dog." You say "no, cow." The child is a bit confused. It looks like a dog.

As you drive, they see another cow. They point and say "cow". You congratulate them. They feel good.

As you drive, they see a horse. They've never seen a horse before. They point and say "cow". You say "no, horse." The child is again confused. It looks like a cow.

The process of adding a confirming item to the model is easier than having to expand your model so that it includes the "new thing."

Sadly, people are lazy, and so they are more likely to assimilate new knowledge into an existing framework than they are to modify their existing model to incorporate new knowledge.

The other part I found interesting has to do with the role of reputation. A key maneuver to discredit an expert is to make them seem "mean" or "condescending" toward a non-expert who is arguing an alternate point. Make the expert seem like a bully.

This is classic Fox News behavior. "The MSM is trying to trick you ... but we at Fox would never do that" ... or better, the experts in the MSM just want to destroy poor Sarah Palin. They are powerful; she's just a regular mom who loves America.

One aspect the article does not touch on has to do with one's world view. Folks who see the world as "black and white" are LESS willing to modify their existing models, because that adds complexity and creates "exceptions", kind of like penguins or turkeys, birds that can't fly. Folks who are open to new ideas tend to be more willing to EXPAND their models to bring in alternative perspectives, or exceptions that fall outside the strict parameters of the existing model ... they also tend to be willing to hold a new piece of info in "limbo", whereas the more conservative approach (it's black and white) has to either add the new item quickly or discard it.
 


Powered by DCForum+ Version 1.1 Copyright 1997-2002 DCScripts.com
Software has been extensively modified by the DU administrators


© 2001 - 2011 Democratic Underground, LLC