This column by Monbiot on the problems with public perception of science is excellent and worth reading in its entirety, but I found this passage particularly interesting, as it confirms something I’ve suspected about ideology and belief:
In 2008 the Washington Post summarised recent psychological research on misinformation. This shows that in some cases debunking a false story can increase the number of people who believe it. In one study, 34% of conservatives who were told about the Bush government’s claims that Iraq had weapons of mass destruction were inclined to believe them. But among those who were shown that the government’s claims were later comprehensively refuted by the Duelfer report, 64% ended up believing that Iraq had weapons of mass destruction.
There’s a possible explanation in an article published by Nature in January. It shows that people tend to “take their cue about what they should feel, and hence believe, from the cheers and boos of the home crowd”. Those who see themselves as individualists and those who respect authority, for instance, “tend to dismiss evidence of environmental risks, because the widespread acceptance of such evidence would lead to restrictions on commerce and industry, activities they admire”. Those with more egalitarian values are “more inclined to believe that such activities pose unacceptable risks and should be restricted”.
These divisions, researchers have found, are better at explaining different responses to information than any other factor. Our ideological filters encourage us to interpret new evidence in ways that reinforce our beliefs. “As a result, groups with opposing values often become more polarised, not less, when exposed to scientifically sound information.” The conservatives in the Iraq experiment might have reacted against something they associated with the Duelfer report, rather than the information it contained.
Guardian: The trouble with trusting complex science
I do hope people will read the excerpt above and think about it without putting too much emphasis on the example of global warming, because it goes both ways.
Those of us who believe in it don’t necessarily believe it because of the evidence in and of itself; we believe it because of how our ideology has primed us to respond to the evidence.
That doesn’t mean that neither camp is right, or that both camps are right, or whatever. One side is right and the other is wrong. But being right or wrong may have nothing to do with rationality, education, or anything like that. It may come down entirely to ideology.