Sunday, August 13th, 2017

The fourth and final video in my series on sea level rise in Carpinteria.

If you want to watch all four videos from the beginning, start here.

Monday, May 22nd, 2017

I made another video about sea level rise in Carpinteria. This one looks at what we can do about it.

Monday, March 6th, 2017

I made a video (the first in a series of videos, hopefully) about sea level rise in Carpinteria, where I live. This one is about vulnerability.

Warning: Includes my face. 😜

Saturday, February 7th, 2015

Risk perception, political views, and understanding of science

Most of the time people’s political views don’t stop them from appreciating and taking advantage of scientific knowledge. In isolated cases, though, public policy questions become tangled up with group identity in a way that interferes with that process, which kind of sucks.

These graphs illustrate how that works. More after the cut.

The graphs are from “Climate-Science Communication and the Measurement Problem” (link is to the full paper; also summarized briefly here), by Dan Kahan of the Yale Cultural Cognition Project.

They’re based on a survey of a representative sample of 2,000 US residents. For all 10 graphs, the left axis measures respondents’ perception of risk, from no perceived risk at the bottom to extremely high risk at the top.

In the lefthand graphs, risk perception is plotted against political ideology, with a regression line showing how strongly the two variables are correlated.

In the righthand graphs, risk perception is plotted against something called “Ordinary Science Intelligence” (OSI), which is a measurement of general scientific knowledge, quantitative reasoning skills, and the ability to reassess preexisting belief based on available information. Plots are shown for two groups: those more ideologically liberal than average (the blue bar) and those more ideologically conservative than average (the red bar).

From the graphs you can see that people of different political beliefs have dramatically different views on the risk of global warming and private gun ownership (as well as fracking, which is covered in the paper but which I didn’t include here, and underground storage of nuclear waste, which isn’t covered in the paper but which other work by Kahan shows has an opposite polarity, in which it is the more liberal members of the population whose ideology predicts views that diverge from science).

Interestingly, for these specific topics, knowing more about science (as measured by OSI) doesn’t make people better at conforming their views to those of scientists. Instead, it makes them better at conforming their views to those of their cultural group. If you look at the righthand graphs for these polarized topics, the distance between the liberal and conservative positions increases with OSI.

But those topics are anomalies. Advocates on both sides have infused them with antagonistic cultural meanings, forcing ordinary citizens to choose between recognizing what is known to science and being who they are. Faced with that choice, most people choose the position that protects their group identity. As Kahan explains, it's rational for them to do so — because in that situation, recognizing and expressing the view of scientists would be personally costly in ways that expressing the group view (while being wrong on the science) isn't.

For the other three topics shown in the graphs (radio waves from cell phones, genetically modified food, and childhood vaccination), there is little or none of this ideological entanglement. In each of those cases the public is good at identifying and benefiting from scientific knowledge, and membership in left or right political ideologies doesn’t interfere with that process.

This is a good thing. When people can accurately assess what science says about risks, they make better decisions. The antagonistic cultural meanings that interfere with that process are, in Kahan’s view, a form of pollution — pollution of the science communication environment.

This is why it concerns me to see people squaring off across the ideological divide over the childhood vaccination issue. Until now, childhood vaccination has been pretty uncontroversial. True, there were a few (a very few) people who engaged in misguided advocacy, but only a small number of parents had concerns that caused them to delay or skip vaccination. And while those numbers have increased, they’re still small.

Even a small number of parents who resist vaccinating is problematic. But vilifying those parents and pushing for compulsory vaccination is a really bad idea, because it will lead to more polarization and an increase in antagonistic cultural meanings around the issue. Similarly, when politicians publicly stake out a position citing scientific uncertainty and defending parents' "right to decide", they are stoking antagonistic cultural meanings in a way that will inevitably lead to lower vaccination rates and more preventable disease. That's a shitty thing to do.

Don’t pollute the science communication environment.

Friday, June 20th, 2014

These 5 charts show why the world is still failing on climate change – Vox:

There’s (rightly) a lot of talk these days about climate-change denialism on the right. It’s problematic that so many people are letting their ideology color their perceptions in a way that causes them to minimize the problem.

But it’s not just Fox News viewers who are letting what they want to believe get in the way of understanding the true nature of what’s happening. I wish more people working to address the issue of climate change were honest enough to pay attention to Roger Pielke, Jr.

“Oh, we don’t like what that man is saying. It is challenging to my worldview. Therefore I will denigrate and dismiss him.”

Sigh. Our public discourse around this issue is badly polluted by tribalistic cultural-identity concerns, and not just on one side.

David Roberts’ Heresy on Climate Change

Tuesday, January 27th, 2009

David Roberts has some very insightful things to say about the next steps in the battle over climate change: Heresy of the day: More science is not the answer.

It pains many geeky progressives to realize it, but science is largely beside the point here. It informs the strategy, but it is not itself a strategy. The relevant realm is sociopolitical, and so the strategy must be values-based, rhetorically savvy, and emotionally resonant. Repeating the facts won’t help.

The actual detail of his argument is right on. It’s not about “raising awareness” about what science has discovered. People are “aware” already, to the extent they’re willing to be. They need to be engaged not with more science per se, but with more detail on just how they stand to be affected by climate change, and how they will benefit from taking action.