Archive for the 'science' Category

Mooney on the Hamilton Study on Climate Change Attitudes

Tuesday, May 3rd, 2011

Chris Mooney (who I suspect is working on a new book on this subject, which I can’t wait to read) has another item up today pointing to yet another study that documents the phenomenon that those on the left or right who self-identify as being more knowledgeable about the science of climate change tend to be more, rather than less, polarized in their views. That is, Democrats who see themselves as well-informed on the subject are more likely to acknowledge the actual scientific consensus, while well-informed Republicans are more likely to deny it, than those of either party who say they don’t know as much. See: Climate Change and Well-Informed Denial.

This core finding itself is not new – a 2008 Pew survey also found that Republicans with a college level of education were less likely to accept the science of climate change than Republicans who lack such education. Other studies have also underscored this fundamental point. But for precisely that reason, Hamilton’s research kind of puts it in the realm of indisputable political fact. Not only are we polarized over climate change, but our knowledge and sophistication, when combined with our politics, make matters worse.

How could this be? For Hamilton, the explanation lies in the interaction between how we get information (from trusted news and Internet sources, we think, but we’re actually being selective) and our own biases in evaluating it (objectively, we think, but again, we’re actually being selective). “People increasingly choose news sources that match their own views,” Hamilton writes. “Moreover, they tend to selectively absorb information even from this biased flow, fitting it into their pre-existing beliefs.” In other words, we’re twice biased – based on our views and information sources – and moreover, twice biased in different directions.

Greg Craven’s ‘How It All Ends’ Video

Friday, April 29th, 2011

High-school science teacher Greg Craven had way too much fun making this video:

Lewandowsky on How Ideology Trumps Fact

Thursday, April 28th, 2011

In some ways Australia is ground zero in the climate change catastrophe. For whatever reasons, human-caused perturbations of climate are falling especially hard and especially early Down Under. Which means their politics are probably in some ways a predictor of what we can expect in other parts of the world as things get climatically weirder.

Anyway, I really liked this piece by Australian Stephan Lewandowsky: The truth is out there. It’s more about American politics than Australian politics, but I still get a sense of the Australian reality seeping through:

The late Stephen Jay Gould referred to a fact as something that it would be “perverse to withhold provisional assent.” Notwithstanding the Academy’s clear statement about the existence of global warming and its human-made causes, recent surveys reveal that the majority of US Republicans do not accept this scientific fact.

Indeed, tragically and paradoxically, among Republicans acceptance of the science decreases with their level of education as well as with their self-reported knowledge: Whereas Democrats who believe they understand global warming better also are more likely to believe that it poses a threat in their lifetimes, among Republicans increased belief in understanding global warming is associated with decreased perception of its severity. The more they think they know, the more ignorant they reveal themselves to be.

Why?

What motivates people to reject trivially simple facts – such as the President’s place of birth – as well as more complex facts – such as insights from geophysics and atmospheric science?

The peer-reviewed psychological literature provides some insight into this question. Numerous studies converge onto the conclusion that there is a strong correlation between a person’s endorsement of unregulated free markets as the solution to society’s needs on the one hand, and rejection of climate science on the other. The more “fundamentalist” a person is disposed towards the free market, the more likely they are to be in denial of global warming.

But what do markets have to do with geophysics or the thermal properties of CO2?

The answer is that global warming poses a potential threat to laissez-faire business. If emissions must be cut, then markets must be regulated or at least “nudged” towards alternative sources of energy – and any possibility of regulation is considered a threat to the very essence of their worldview by those for whom the free market is humanity’s crowning achievement.

It is this deep psychological threat that in part explains the hyper-emotionality of the anti-science discourse: the frenetic alarmism about a “world government”, the rhetoric of “warmist” or “extremist” levelled at scientists who rely on the peer reviewed literature, the ready invocation of the spectre of “socialism” – they all point to the perception of threat so fundamental that even crazed beliefs can constitute an alluring antidote.

More Mooney on Scientists on Reasoning

Monday, April 25th, 2011

Chris Mooney comments on a new study by Hugo Mercier and Dan Sperber in this item: Is Reasoning Built for Winning Arguments, Rather Than Finding Truth? (Short answer: Quite possibly yes.)

Here’s the abstract of Mercier and Sperber’s paper (see Why Do Humans Reason? Arguments for an Argumentative Theory):

Reasoning is generally seen as a means to improve knowledge and make better decisions. However, much evidence shows that reasoning often leads to epistemic distortions and poor decisions. This suggests that the function of reasoning should be rethought. Our hypothesis is that the function of reasoning is argumentative. It is to devise and evaluate arguments intended to persuade. Reasoning so conceived is adaptive given the exceptional dependence of humans on communication and their vulnerability to misinformation. A wide range of evidence in the psychology of reasoning and decision making can be reinterpreted and better explained in the light of this hypothesis. Poor performance in standard reasoning tasks is explained by the lack of argumentative context. When the same problems are placed in a proper argumentative setting, people turn out to be skilled arguers. Skilled arguers, however, are not after the truth but after arguments supporting their views. This explains the notorious confirmation bias. This bias is apparent not only when people are actually arguing but also when they are reasoning proactively from the perspective of having to defend their opinions. Reasoning so motivated can distort evaluations and attitudes and allow erroneous beliefs to persist. Proactively used reasoning also favors decisions that are easy to justify but not necessarily better. In all these instances traditionally described as failures or flaws, reasoning does exactly what can be expected of an argumentative device: Look for arguments that support a given conclusion, and, ceteris paribus, favor conclusions for which arguments can be found.

Mercier explained it to Mooney this way:

If reasoning evolved so we can argue with others, then we should be biased in our search for arguments. In a discussion, I have little use for arguments that support your point of view or that rebut mine. Accordingly, reasoning should display a confirmation bias: it should be more likely to find arguments that support our point of view or rebut those that we oppose. Short (but emphatic) answer: it does, and very much so. The confirmation bias is one of the most robust and prevalent biases in reasoning. This is a very puzzling trait of reasoning if reasoning had a classical, Cartesian function of bettering our beliefs – especially as the confirmation bias is responsible for all sorts of mischief… Interestingly, the confirmation bias need not be a drag on a group’s ability to argue. To the extent that it is mostly the production, and not the evaluation of arguments that is biased – and that seems to be the case – then a group of people arguing should still be able to settle on the best answer, despite the confirmation bias… As a matter of fact, the confirmation bias can then even be considered a form of division of cognitive labor: instead of all group members having to laboriously go through the pros and cons of each option, if each member is biased towards one option, she will find the pros of that option, and the cons of the others – which is much easier – and the others will do their own bit.
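Mercier’s division-of-cognitive-labor claim can be made concrete with a toy simulation. This is just my own illustrative sketch, not a model from their paper: the option qualities, noise levels, and group size are invented for illustration. Each “advocate” is fully biased in production (she only generates arguments for her own option), but evaluation of the pooled arguments is unbiased, and the group still tends to pick the genuinely best option far more often than a lone biased reasoner does.

```python
import random

def argue(quality, rng, n=5):
    # An advocate produces n arguments for her option; stronger options
    # tend to yield stronger arguments (a noisy signal of true quality).
    return [quality + rng.gauss(0, 1) for _ in range(n)]

def group_decision(qualities, rng):
    # Biased production: one advocate per option, each arguing only for
    # her own. Unbiased evaluation: the group pools the arguments and
    # picks the option whose arguments are strongest on average.
    support = [sum(argue(q, rng)) / 5 for q in qualities]
    return support.index(max(support))

def solo_decision(qualities, rng):
    # A lone biased reasoner just sticks with a randomly favored option,
    # since her confirmation bias will find support for it regardless.
    return rng.randrange(len(qualities))

def accuracy(decide, trials=2000, seed=1):
    # Fraction of trials in which the decision procedure picks the
    # option with the highest true quality.
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        qualities = [rng.random() * 3 for _ in range(4)]
        best = qualities.index(max(qualities))
        if decide(qualities, rng) == best:
            hits += 1
    return hits / trials

print("group accuracy:", accuracy(group_decision))
print("solo accuracy: ", accuracy(solo_decision))
```

In this sketch the solo reasoner is right about a quarter of the time (chance, with four options), while the arguing group does much better, even though every individual in it is maximally biased in which arguments she produces.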

Mooney observes:

I think this evolutionary perspective may explain one hell of a lot. Picture us around the campfire, arguing in a group about whether we need to move the camp before winter comes on, or stay in this location a little longer. Mercier and Sperber say we’re very good at that, and that the group will do better than a lone individual at making such a decision, thanks to the process of group reasoning, where everybody’s view gets interrogated by those with differing perspectives.

But individuals – or, groups that are very like minded – may go off the rails when using reasoning. The confirmation bias, which makes us so good at seeing evidence to support our views, also leads us to ignore contrary evidence. Motivated reasoning, which lets us quickly pull together the arguments and views that support what we already believe, makes us impervious to changing our minds. And groups where everyone agrees are known to become more extreme in their views after “deliberating” – this is the problem with much of the blogosphere.

It’s interesting to me how the legal system recreates and formalizes the roles of this (hypothetical) evolutionary collective decision-making mechanism. We have the interested parties advocating on each side. We have the impartial referee (the judge) whose job it is to make sure the advocates follow a set of rules designed to ensure fairness. And then we have the jury: A group of objective, disinterested observers who evaluate the arguments of the advocates, and, free of the distortions of their own confirmation bias, decide which side is right.

This plays out really interestingly in the age of the Internet. The Internet reduces the distance between me and confirmatory evidence to zero, so I’m able to amass what appears to me to be an unassailable edifice of fact in support of whatever position I’m arguing. It also makes it very convenient for me to assemble with others who share my views (since no matter how outré those views are, the Internet reduces the distance between me and my would-be cohorts to zero as well). With our evolutionary craving for societal connection satisfied, we crazies can hive off into our own little world untroubled by the arguments of those who disagree with us.

I also like how this applies to politics. The advocates on both sides are all convinced of their own rightness, regardless of reality. The “jury” ends up being those infamous independent voters, those who don’t really care about the big issues of the day, and wake up every few years just before election time to do a quick, fairly disengaged evaluation of the arguments of each side. It’s an interesting notion that those low-information voters actually could be by far the most important players in the process. Because of our evolutionary tendency to lean on the scales of our own judgment, we are fatally biased as decision-makers on any subject about which we care deeply. We desperately need those blithe, unconcerned voters in the middle to help us reach good collective decisions.

We need civility. We need reasonable, rational middlemen (and -women). We need venues where advocates from both sides can engage with each other and lay out their arguments, but where relatively disinterested observers are also present to evaluate those arguments. We need communities that we are connected to, and feel a part of, but that nevertheless span ideological boundaries.

That used to be the way all communities worked, because they were by definition local, defined by the limits of transportation and communication to consist of the people who lived and worked in a given location. They don’t necessarily work that way anymore, which I’ve tended to think of as a good thing. But in light of the implications of Mercier and Sperber’s research, I may need to reconsider.

Mooney on Science and Its Opposite

Tuesday, April 19th, 2011

I’ve been following Chris Mooney’s blogging and podcasting really closely for a while now; he’s digging into subjects that I find fascinating, and I like his take on them.

He has an article in the upcoming issue of Mother Jones that covers some of the most interesting stuff he’s been into lately: The science of why we don’t believe science. It discusses recent research that Dan Kahan (among others) has been doing — see Cultural Cognition of Scientific Consensus (PDF) — that discusses how a person’s deeply held values influence his or her perception of scientific opinion. So, for example, if you are a politically conservative/libertarian-leaning person (in Kahan’s formulation, someone who values hierarchy over egalitarianism and individualism over communitarianism), then you will resist the (truthful) idea that there exists a scientific consensus that human activity is warming the planet, and poses a grave risk, presumably because you fear that such a consensus, if it existed, would be used to advance a government regulatory regime that would restrict free enterprise.

But the same sword cuts the other way, too. Kahan’s research shows that those on the progressive end of the political spectrum (in his formulation, those who value egalitarianism over hierarchy and communitarianism over individualism) are similarly likely to question the (truthful) idea that there exists a scientific consensus that radioactive wastes from nuclear power can be safely disposed of in deep underground storage facilities.

Mooney interviewed Kahan for his Point of Inquiry podcast a few months ago (Dan Kahan – The American Culture War of Fact), and it was a great interview; highly recommended. I was actually kind of disappointed that the article in Mother Jones didn’t go into as much detail as the podcast (see my griping here, for example). But for a broad-but-shallow overview of some intriguing aspects of the issue, it (the Mother Jones article) is definitely worth reading.

Laden on Skeptics on Fukushima

Tuesday, March 15th, 2011

Greg Laden has some really good comments on the ongoing Fukushima reactor events and their spinning by pro-/anti-nuclear advocates: The Fukushima Disaster, Hyperbole, Credibility, Skepticism, and the Future of Nuclear Power.

I honestly think that it is too early to have this conversation, but alas, the conversation has been forced.

O’Reilly: Atheist Billboards Are Insulting

Thursday, January 6th, 2011

Bill O’Reilly on how atheist billboards are an insult to believers. Also, he believes in God because of the tides. Courtesy of NorthernLight, who posted it in the comments to Climate roulette:

Novella on Bedbugs AND Meta-cognition (*Swoon*)

Friday, December 31st, 2010

Oh, man-crush Steven Novella, how do I love thy postings at Neurologica? Let me count the ways…

Um, okay: two. That is, I love the latest post at Neurologica (The Coming Bedbug Plague) two ways: It is about an insect (which is a topic I’m lately fairly obsessed with) and it links the insect story with a pithy observation about humans’ mistaken belief in the inevitability of progress.

Here’s my favorite bit from the part about progress:

My initial surprise at hearing this story, I think, reflects an inherent progressivist bias in our thinking. We tend to think of human history as making inexorable progress. This bias is reinforced, especially since the industrial revolution, by the fact that science and technology has been relentlessly progressive. The problem is in the default assumption that all change is progressive – whatever current system we have must be better than the old system because newer is better.

Human history, however, is more complex than our default assumptions. Sometimes history is regressive. And sometimes it is cyclical. Not all current trends will extrapolate indefinitely into the future. Today’s fad is not always the wave of the future.

In my mind bedbugs were a problem of pre or early industrial societies, and were no longer an issue given modern hygiene and pest-control. I associated bedbugs with an earlier age, and it just seemed incongruous that they could return in the 21st century. But the details tell a different story.

I’m not sure I’ve mentioned the recent insect obsession on lies.com, but you can find evidence of it, if you’re interested, at my local nature-y blog, Carp Without Cars. Or you can examine my recently uploaded images at Bugguide.net. Or you can watch this video I took in my bedroom the other day, of a case-bearing carpet moth caterpillar, and contemplate the fact that taking that video was kind of the high point of my week:

Or you could just take my word for it: I’m kind of into bugs lately.

Yong on the Urge to Cling Harder to Shaken Belief

Wednesday, December 29th, 2010

Awesome science blogger Ed Yong wrote back in October about a new study demonstrating the lengths to which people will go to avoid cognitive dissonance: When in doubt, shout – why shaking someone’s beliefs turns them into stronger advocates.

You don’t have to look very far for examples of people holding on to their beliefs in the face of overwhelming evidence to the contrary. Thousands still hold to the idea that vaccines cause autism, that all life was created a few thousand years ago, and even that drinking industrial bleach is a good idea. Look at comment threads across the internet and you’ll inevitably find legions of people who boldly support these ideas in the face of any rational argument.

That might be depressing, but it’s not unexpected. In a new study, David Gal and Derek Rucker from Northwestern University have found that when people’s confidence in their beliefs is shaken, they become stronger advocates for those beliefs. The duo carried out three experiments involving issues such as animal testing, dietary preferences, and loyalty towards Macs over PCs. In each one, they subtly manipulated their subjects’ confidence and found the same thing: when faced with doubt, people shout even louder.

There are a couple of obvious tie-ins to the climate change debate: Deniers deny even more fiercely in the face of mounting scientific evidence that climate change is real, and that urgent action to address it is imperative. And I guess it cuts the other way, too, as shcb is no doubt already preparing to type in response: In the face of public relations setbacks, the climate change believers are redoubling their own efforts. If you believe that the believers are factually wrong, and that the evidence against them is legitimate, then it matches up in exactly the same way.

And accused people tend to protest their innocence, whether or not they are guilty. That doesn’t make the two cases equivalent, though. There is such a thing as actual innocence, and it makes a difference.

For more great stuff from Ed Yong, check out his NERS Review of the year Part 9 – Twists and lessons.

Novella on Anecdotes, Anomalies, and the Importance of Context

Tuesday, December 7th, 2010

Steven Novella has thought a lot about thinking. I offer in evidence the following post from his Neurologica blog: The Context of Anecdotes and Anomalies.

The problem with anecdotes is that they are subject to a host of biases, such as confirmation bias. They are easily cherry picked, even unintentionally, and therefore can be used to support just about any position. For every anecdote, there is an equal and opposite anecdote.

I really liked it, and heartily recommend the whole thing to the friendly local conspiracy theorists. Unfortunately, I also predict that they will fail to recognize it as a valid indictment of their epistemological shortcomings. Oh, well.

Staniford on Denialism, Drought, and Politics

Saturday, November 13th, 2010

My latest man-crush is Stuart Staniford. He has a PhD in physics from UC Davis and is currently chief scientist for FireEye, a company that develops security software. As a hobby, though, he obsesses about long-term risks to humanity, and in his most recent bloggy incarnation he writes Early Warning, which I just came across last week when Kevin Drum linked to it.

I don’t always do this when I come across a new blog, but in this case I’ve gone back to the beginning and am reading the whole thing in chronological order. It’s really good stuff. I enjoy following along as Staniford works his way through a problem; he’s logical, intelligent, and has a real knack for conveying the technical details of an issue in terms that neither overwhelm nor talk down to a non-technical reader.

Here’s one item I liked: The Elephant in the Room, in which he reviews the book of the same name. Staniford actually doesn’t think much of the book, but his comments on the subject itself are very cool:

When something is scary, people have an incentive to somehow avoid dealing with the facts, and a variety of creative strategies are available to them.

And in the alternative, if you commit yourself in some way to the idea that a particular risk is a big deal, (eg taking a public position, making career choices based on your assessment), you have a psychological incentive to deny evidence that maybe the problem is not so severe after all.

I think it’s these dueling incentives that create the structure we so often see around major global risks – one side is busy either ignoring the problem, or if that is no longer working, minimizing it, attacking the integrity of the proponents, etc. Meanwhile, the other side is at risk of exaggerating the seriousness of the problem, ignoring countervailing evidence or important context and of course attacking the integrity of the deniers. Both sides are often sincerely convinced of their own rightness (though there certainly can be scope for cynicism and deliberate dishonesty as well, and both sides will be very quick to point to the evidence for this on the other side, and very slow to examine it on their own side).

Staniford has a recent series of posts on the future of drought that were especially good. They basically concern his emotional and intellectual reactions to this graph:

It’s fascinating stuff. Staniford basically freaks out in response to seeing the graph and trying to wrap his head around what it means, then settles himself down and says, in effect, hey, I’m a scientist. I need to engage with this thing rationally. And then he does just that, and takes the reader along with him as he does his research and fits the pieces together. Highly recommended.

If a seven-part series of posts is too much of an investment, here’s a recent piece of Staniford’s that might pique the interest of this site’s readers in particular: A few election thoughts.

What I see happening is this: the public is aware, rather inchoately, that things are going badly wrong and that the life they are accustomed to is under threat, but they have no idea what to do. The parties, by and large, have failed to diagnose the roots of the problem, and instead are reflexively proposing to relive their greatest hits of the past. Since the problems of the past are not the problems of the present, these approaches are not working. This is leading both parties into a cycle of over-promising what they can deliver, thus leading to bitter disappointment.

He goes on to detail just how it is that he sees the two major parties failing, and he doesn’t pull his punches. Like Jon Stewart talking to Rachel Maddow, the thing that strikes me the most, I think, is just how refreshing it is to read the take of someone who doesn’t feel the need to be throwing monkey poo at one side or the other, but is willing to stand back and say look: monkeys throwing poo.

Staniford on Science (the Magazine)’s Climate Alarmism

Wednesday, November 10th, 2010

I’m posting this item mostly because I know it will be like catnip for shcb. Enjoy! From Stuart Staniford of the Early Warning blog: Climate Alarmism at Science Magazine?

When Scientists Actually Do Fabricate Data

Saturday, August 28th, 2010

In light of recent discussions we’ve been having about alleged bogus science, I thought this story was interesting. It concerns Dr. Marc Hauser, a “star researcher” from Harvard who is an expert on animal and human cognition, and who has written on the evolutionary basis of morality. It also appears, though, that he may have intentionally fudged research data in order to arrive at a predetermined result: Marc Hauser May Have Fabricated Data at Harvard Lab.

Some forms of scientific error, like poor record keeping or even mistaken results, are forgivable, but fabrication of data, if such a charge were to be proved against Dr. Hauser, is usually followed by expulsion from the scientific community.

“There is a difference between breaking the rules and breaking the most sacred of all rules,” said Jonathan Haidt, a moral psychologist at the University of Virginia. The failure to have performed a reported control experiment would be “a very serious and perhaps unforgivable offense,” Dr. Haidt said.

Makes for an interesting contrast, doesn’t it? You could compare it, say, to the East Anglia Climate Research Unit, where allegations of misdeeds following the theft and selective release of emails led to three independent investigations, all of which found that researchers acted with honesty and integrity, and that their results were scientifically valid.

Toles on Global Warming Denialism

Friday, August 13th, 2010

What Tom Toles said: Election digest.

If you can’t accept the conclusions of 98 percent of the scientists whose FIELD IT IS, then why even bother with science? If that high a percentage of a field of study is to be discounted ENTIRELY, then we are in deep trouble, which, of course, we are. It would be so simple if it were just a matter of ignoring the yelping commenters hereabouts: “Move on, Mr. Cartoonist! Chill out Tommy! There are more important things to worry about!”

Really? Which would those things be? This may be the only political issue whose results could be catastrophic PERMANENTLY. But the deliberate dust storm thrown up by fossil-fuel-centric interests has succeeded in contaminating and paralyzing the American response. Quite a victory for the deniers! It looks like mass-suicide to me.

I Wish Global Warming Was a Hoax. Unfortunately, It’s Not.

Saturday, July 31st, 2010

I noticed in the comments to the previous item that shcb thinks I’m showing closed-mindedness (or something) by virtue of my resistance to the evidence that human-caused climate change is a hoax.

Sigh.

Coyne on the “Tom Johnson” Sock Puppet

Tuesday, July 27th, 2010

From Jerry Coyne, author of the book Why Evolution Is True, comes this interesting (if you’re into high-profile falsehood, at least) account of an apparent act of sock puppetry aimed at questioning the “New Atheist” approach to confronting religious believers: On the uncivility of atheists: “Tom Johnson” and Exhibit A.

MacKenzie on Denialism

Friday, June 4th, 2010

Writing in New Scientist, Debora MacKenzie has an article that is right up my alley: Living in denial: Why sensible people reject the truth.

All denialisms appear to be attempts like this to regain a sense of agency over uncaring nature: blaming autism on vaccines rather than an unknown natural cause, insisting that humans were made by divine plan, rejecting the idea that actions we thought were okay, such as smoking and burning coal, have turned out to be dangerous.

This is not necessarily malicious, or even explicitly anti-science. Indeed, the alternative explanations are usually portrayed as scientific. Nor is it willfully dishonest. It only requires people to think the way most people do: in terms of anecdote, emotion and cognitive short cuts. Denialist explanations may be couched in sciency language, but they rest on anecdotal evidence and the emotional appeal of regaining control.

Connecticut Principal: Gifted 10-year-olds Can’t Handle the Truth about Darwin

Friday, April 16th, 2010

Public school students should be taught science in their science classes. I think that’s particularly true in the case of gifted children, from whose ranks we can expect the next generation of scientists to emerge. So it’s troubling when a principal more attuned to avoiding confrontations with Christian fundamentalist parents than with serving the educational needs of his students tells a teacher he can’t teach Darwin, for fear it might offend.

Still, I realize that that sort of thing happens. It happens in places like Texas and Arkansas. But thanks to Steven Novella writing at Neurologica, I’ve learned that it also happens in places like Connecticut: Mark Tangarone: Weston TAG teacher leaves over evolution flap.

The whole story is pretty interesting. But the crucial piece of evidence for cutting through the he-said/he-said of the Weston schools superintendent (who says this is a “personnel matter” involving a “disgruntled employee” and has nothing to do with the teaching of evolution) and the teacher who is now resigning (who says he’s leaving because he was ordered to eliminate the teaching of Darwin’s work from his gifted students’ science curriculum), is an email that the teacher received in late 2008 from his then-principal, which reads as follows:

While evolution is a robust scientific theory, it is a philosophically unsatisfactory explanation for the diversity of life. I could anticipate that a number of our parents might object to this topic as part of a TAG project, and further, parents who would object if evolution was part of a presentation by a student to students who do not participate in the TAG program.

Evolution touches on a core belief — Do we share common ancestry with other living organisms? What does it mean to be a human being? I don’t believe that this core belief is one in which you want to debate with children or their parents, and I know personally that I would be challenged in leading a 10-year-old through this sort of discussion while maintaining the appropriate sensitivity to a family’s religious beliefs or traditions.

In short, evolution is a topic that is not age appropriate, is not part of our existing curriculum, is not part of the state frameworks at this point in a student’s education, nor a topic in which you have particular expertise. For all of these reasons, the TAG topics need to be altered this year to eliminate the teaching of Darwin’s work and the theory of evolution.

Oy. The principal who sent that email, Mark Ribbens, apparently has a PhD, since he’s referred to as “Dr.” Ribbens in the news article. I wonder what his doctorate was in.

Update: I was wrong; on further investigation he turns out to have an Ed.D. Not that I’m saying there’s something wrong with that. But I should have realized.

He’s still a principal, but now he’s the principal of a part-time performing arts magnet school. I’m thinking that might be a better pedagogical niche for him than supervising the science education of gifted middle schoolers. Anyway, if you’d like to share your philosophical dissatisfaction with Dr. Ribbens’ views on the teaching of evolution, you can reach him at ribbensm@ces.k12.ct.us, or call him at (203) 365-8851.

I just emailed him as follows (with a CC: to the head of his current school’s parents association, on the theory that Ribbens might be more sensitive to her views than to mine):

Dr. Ribbens,

I was disturbed to read the article in the Weston Forum quoting your email to Mark Tangarone from a few years ago, in which you forbade him to teach evolution to students in the TAG program at Weston Intermediate. See:

http://www.acorn-online.com/joomla15/thewestonforum/news/local/55349-mark-tangarone-tag-teacher-leaves-over-evolution-flap.html

I realize that there probably are (at least) two sides to this matter, and that your email may have been misquoted or taken out of context. But if the quoted statements are accurate, then I encourage you to be more careful in the future when deciding that the religious sensitivities of a subset of parents are sufficient reason to prevent your students from receiving age-appropriate science instruction.

Particularly troubling to me was the following line from your email: “While evolution is a robust scientific theory, it is a philosophically unsatisfactory explanation for the diversity of life.” I’m curious what you mean by that. In what sense do you believe the theory of evolution to be philosophically unsatisfying?

The question of whether or not we share common ancestry with other living organisms is not just “a core belief,” as you describe it in your email. It is a fact, one that has been established as thoroughly as it is possible for scientific investigation to establish such things. It forms the conceptual basis of the bulk of modern medical and biological science. To intervene with a science teacher to prevent middle school students from learning that fact strikes me as profoundly misguided.

I realize that your current position as principal of a part-time performing arts magnet school limits your influence on curriculum decisions regarding science, and speaking frankly as the parent of a school-aged child, I think that’s probably for the best. But I hope you will think about this issue more carefully should you find yourself in a position to make similar decisions in the future.

Thanks.

John Callender
jbc@jbcsystems.com

So there it is: My “someone is wrong on the Internet!” moment for the day.

Steven Novella on Hyperactive Agency Detection

Monday, March 22nd, 2010

Remember my new man-crush, neuroscientist and skeptic Steven Novella? He’s got a great item on his NeuroLogica blog today: Hyperactive Agency Detection.

When HADD is triggered and we think we see the hidden agent, it speaks to us in a very primal way. For some people the perception of hidden agency becomes overwhelming, dominating all other thought processes. We know these people as conspiracy theorists. But there is a little conspiracy theorist inside each of us.

He talks about the evolutionary underpinnings of humans’ tendency to partition the world into two classes of entities — agents and objects — and the possible role this may play in our collective tendency toward conspiracy theories and religion.

In Which I Confess to the Onset of a Sudden Man Crush on Steven Novella, MD

Saturday, February 27th, 2010

Barbara Tomlinson pointed out to me that Phil Plait (of the Bad Astronomy blog) linked to me today in Two posts about denialism, climate change and otherwise. That was pretty cool, given how much I like the Bad Astronomy blog. The post reads a bit like Plait is crediting me with having come up with the O.J. Simpson/climate-change denialism metaphor, when of course it was Bill McKibben who did that; I just quoted from and linked to him. But I’m not proud; I’ll take the traffic, and will secretly cherish the thought that Phil Plait linked to Lies.com! Yay!

Even better: By linking to me he made sure I’d pay extra-close attention to the post in which he did so, thereby bringing my attention to someone I’ve inexplicably never read before: Steven Novella, MD, of the NeuroLogica Blog.

Wow. Just wow. This is serious pay dirt.

Here’s the NeuroLogica post that Plait thought was worth sharing a post with my McKibben theft: Scientific consensus, climate change, and vaccines. After quoting Robert Kennedy Jr. on the strength of the scientific consensus on global warming, Novella continues like this:

But Robert Kennedy is not always a fan of the scientific consensus – for example he rejects the scientific consensus on vaccines, choosing to believe that the consensus is a deliberate fraud (exactly what global warming dissidents say about the climate change consensus). This makes Robert Kennedy a hypocrite – he accepts the scientific consensus and cites its authority when it suits his politics, and then blithely rejects it (spinning absurd conspiracy theories that would make Jesse Ventura blush) when it is inconvenient to his politics.

But Kennedy is not alone – this seems to be what most people do most of the time. In fact I would argue that we need to be especially suspicious of our scientific opinions on controversial topics when they conform to our personal ideology (whether political, social, or religious). That is when we need to step back and ask hard questions that challenge the views we want to hold. We also need to make sure that our process is consistent across questions – are we citing the scientific consensus on one issue and rejecting it on another? Are we citing conflicts of interest for researchers whose conclusions we don’t like, and ignoring them for researchers whose conclusions confirm our beliefs?

Did I already say wow?

Here’s another item from Novella: Letters from a 9/11 conspiracy theorist. I’m going to let you go read that yourself. And I think you know who I mean by you. :-)

Finally, Novella is the host and producer of a weekly science podcast, The Skeptics’ Guide to the Universe, that is currently on its 240th episode. Looks like my commute just got booked solid.