View Full Version : How facts backfire



Qtec
07-20-2010, 05:02 AM
<div class="ubbcode-block"><div class="ubbcode-header">Quote:</div><div class="ubbcode-body"><span style='font-size: 14pt'>How facts backfire
Researchers discover a surprising threat to democracy: our brains</span>
By Joe Keohane

July 11, 2010

It’s one of the great assumptions underlying modern democracy that an informed citizenry is preferable to an uninformed one. “Whenever the people are well-informed, they can be trusted with their own government,” Thomas Jefferson wrote in 1789. This notion, carried down through the years, underlies everything from humble political pamphlets to presidential debates to the very notion of a free press. Mankind may be crooked timber, as Kant put it, uniquely susceptible to ignorance and misinformation, but it’s an article of faith that knowledge is the best remedy. If people are furnished with the facts, they will be clearer thinkers and better citizens. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight.


In the end, truth will out. Won’t it?

Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.

“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”

These findings open a long-running argument about the political ignorance of American citizens to broader questions about the interplay between the nature of human intelligence and our democratic ideals. Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information. And then we vote.</div></div>

link (http://www.boston.com/bostonglobe/ideas/articles/2010/07/11/how_facts_backfire/)


Q

LWW
07-20-2010, 05:15 AM
I have never seen an article in my life that better explained the psychology of the Obama supporter.

Thanks for sharing.

LWW

Stretch
07-20-2010, 10:26 AM
<div class="ubbcode-block"><div class="ubbcode-header">Originally Posted By: LWW</div><div class="ubbcode-body">I have never seen an article in my life that better explained the psychology of the Obama supporter.

Thanks for sharing.

LWW </div></div>

.....and so accurately reflects your state of mind. Feel free to "backfire" your reply, just to add credence. St.

LWW
07-20-2010, 12:04 PM
Thanks for proving me right.

Here's how it works around here ... I present facts, Q and G present gibberish and deny reality. Then they post data they didn't read ... I read it and show how silly they were to jump to a wrong conclusion, and then they argue that their own source is wrong.

Every leftist here believes in "TRUTH" being an entirely malleable concept and defined only by the party.

LWW

Stretch
07-20-2010, 12:46 PM
<div class="ubbcode-block"><div class="ubbcode-header">Originally Posted By: LWW</div><div class="ubbcode-body">Thanks for proving me right.

Here's how it works around here ... I present facts, Q and G present gibberish and deny reality. Then they post data they didn't read ... I read it and show how silly they were to jump to a wrong conclusion, and then they argue that their own source is wrong.

Every leftist here believes in "TRUTH" being an entirely malleable concept and defined only by the party.

LWW </div></div>

Thanks for the backfire. Predictable as usual, dead wrong as always, capricious drivel as is your brand. And THAT'S how it works around here, Herr Lar. lol St.

Sid_Vicious
07-20-2010, 02:35 PM
This excerpt from your post really says it all, and it is a sad feature of the future welfare of this country! "Heads in the sand or up the arse fits for these people." Oh yeah, un-American too. To avoid truth is to promote the cause of terrorists. You become part of the problem, and an active one, since you fester in others' minds on public internet sites. You can't really be as stupid as you come across when truth, undeniable, documented truth is stated, and you choose to aid evil by nonsense actions...sid

"researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger."