
Correcting false rumors

If the Internet makes it easy to spread rumors, it also makes it relatively easy to check their veracity.

All you have to do is go to sites like Snopes.com or FactCheck.org.

So why worry?

Surely, you might think, no one believes a rumor after it's been disproved. Here are a couple of examples that show how hard it is to correct misinformation.

On the liberal side:  

When John Roberts was nominated to the Supreme Court in 2005, an abortion-rights group ran a television ad attacking him for siding with a man who had been convicted of bombing an abortion clinic.

The ad was false.  The case in which Roberts was involved had nothing to do with bombing; it was about the legality of blockades.  Roberts, then a government lawyer, argued that abortion clinics couldn't use an 1871 federal anti-discrimination statute against anti-abortion protesters who tried to blockade clinics, because such blockades were already illegal under state law.  Eventually a 6-3 majority of the Supreme Court agreed with him.

When the group that created the ad realized this, they withdrew it and admitted that it was inaccurate.  What's interesting is how it affected the attitudes of people whose minds were already made up about Roberts.

Before the ad ran, 56 percent of Democrats opposed Roberts' nomination.  After the ad, their opposition increased to 80 percent.  But when the ad was withdrawn and repudiated by the very group that ran it, their opposition declined only to 70 percent, still 25 percent higher than it had been before the ad ran.

On the conservative side:

 

Political scientists Brendan Nyhan and Jason Reifler provided two groups of volunteers with the Bush administration's pre-war claims that Iraq had weapons of mass destruction.


One group was also given a refutation: the comprehensive 2004 Duelfer report, which concluded that Iraq did not have weapons of mass destruction before the United States invaded in 2003.


Thirty-four percent of the group given only the pre-war claims thought Iraq had hidden or destroyed its weapons before the U.S. invasion, but 64 percent of those who saw both the claim and the refutation thought Iraq had the weapons.  Correcting the misinformation increased belief in it by 88 percent (from 34 to 64 percent)!


The refutation, in other words, reinforced the misinformation rather than correcting it.

 

In the Roberts case, the emotional content of the initial misinformation persisted long after the rational content had been corrected.  In the second case, it seems that some people actually harden their position when presented with information they consider contradictory.  


In other words, people simply believe what they want to believe. Political operatives discovered this phenomenon long before social scientists did.  It's another way we get sorted into camps of "us" and "them."



