This column by Monbiot on the problems with the public perception of science is excellent and worth reading in its entirety, but I found the following passage particularly interesting, as it confirms something I've long suspected about ideology and belief:
In 2008 the Washington Post summarised recent psychological research on misinformation. This shows that in some cases debunking a false story can increase the number of people who believe it. In one study, 34% of conservatives who were told about the Bush government’s claims that Iraq had weapons of mass destruction were inclined to believe them. But among those who were shown that the government’s claims were later comprehensively refuted by the Duelfer report, 64% ended up believing that Iraq had weapons of mass destruction.
There’s a possible explanation in an article published by Nature in January. It shows that people tend to “take their cue about what they should feel, and hence believe, from the cheers and boos of the home crowd”. Those who see themselves as individualists and those who respect authority, for instance, “tend to dismiss evidence of environmental risks, because the widespread acceptance of such evidence would lead to restrictions on commerce and industry, activities they admire”. Those with more egalitarian values are “more inclined to believe that such activities pose unacceptable risks and should be restricted”.
These divisions, researchers have found, are better at explaining different responses to information than any other factor. Our ideological filters encourage us to interpret new evidence in ways that reinforce our beliefs. “As a result, groups with opposing values often become more polarised, not less, when exposed to scientifically sound information.” The conservatives in the Iraq experiment might have reacted against something they associated with the Duelfer report, rather than the information it contained.
Guardian: The trouble with trusting complex science