Risk perception, political views, and understanding of science

Most of the time people’s political views don’t stop them from appreciating and taking advantage of scientific knowledge. In isolated cases, though, public policy questions become tangled up with group identity in a way that interferes with that process, which kind of sucks.

These graphs illustrate how that works.

The graphs are from “Climate-Science Communication and the Measurement Problem” (link is to the full paper; also summarized briefly here), by Dan Kahan of the Yale Cultural Cognition Project.

They’re based on a survey of a representative sample of 2,000 US residents. For all 10 graphs, the left axis measures respondents’ perception of risk, from no perceived risk at the bottom to extremely high risk at the top.

In the left-hand graphs, risk perception is plotted against political ideology, with a regression line showing how strongly the two variables are correlated.

In the right-hand graphs, risk perception is plotted against something called “Ordinary Science Intelligence” (OSI), which is a measurement of general scientific knowledge, quantitative reasoning skills, and the ability to reassess preexisting beliefs based on available information. Plots are shown for two groups: those more ideologically liberal than average (the blue bar) and those more ideologically conservative than average (the red bar).

From the graphs you can see that people of different political beliefs have dramatically different views on the risk of global warming and private gun ownership. The same is true of fracking, which is covered in the paper but which I didn’t include here, and of underground storage of nuclear waste, which isn’t covered in the paper but which other work by Kahan shows has the opposite polarity: there it is the more liberal members of the population whose ideology predicts views that diverge from the science.

Interestingly, for these specific topics, knowing more about science (as measured by OSI) doesn’t make people better at conforming their views to those of scientists. Instead, it makes them better at conforming their views to those of their cultural group. If you look at the right-hand graphs for these polarized topics, the distance between the liberal and conservative positions increases with OSI.

But those topics are anomalies. Advocates on both sides have infused them with antagonistic cultural meanings, forcing ordinary citizens to choose between recognizing what is known to science and being who they are. Faced with that choice, most people choose the position that protects their group identity. As Kahan explains, it’s rational for them to do so — because in that situation, recognizing and expressing the view of scientists would be personally costly in ways that expressing the group view (while being wrong on the science) isn’t.

For the other three topics shown in the graphs (radio waves from cell phones, genetically modified food, and childhood vaccination), there is little or none of this ideological entanglement. In each of those cases the public is good at identifying and benefiting from scientific knowledge, and left or right political identity doesn’t interfere with that process.

This is a good thing. When people can accurately assess what science says about risks, they make better decisions. The antagonistic cultural meanings that interfere with that process are, in Kahan’s view, a form of pollution — pollution of the science communication environment.

This is why it concerns me to see people squaring off across the ideological divide over the childhood vaccination issue. Until now, childhood vaccination has been pretty uncontroversial. True, there were a few (a very few) people who engaged in misguided advocacy, but only a small number of parents had concerns that caused them to delay or skip vaccination. And while those numbers have increased, they’re still small.

Even a small number of parents who resist vaccinating is problematic. But vilifying those parents and pushing for compulsory vaccination is a really bad idea, because it will lead to more polarization and an increase in antagonistic cultural meanings around the issue. Similarly, when politicians publicly stake out a position of citing scientific uncertainty and defending parents’ “right to decide”, they are stoking antagonistic cultural meanings in a way that will inevitably lead to lower vaccination rates and more preventable disease. That’s a shitty thing to do.

Don’t pollute the science communication environment.

Reposted from http://ift.tt/16E84xT.

Tags: science communication, dan kahan, cultural cognition, vaccination, global warming.
