Archive for the 'Epistemology' Category

Martin: Republicans on Their Closed Media Universe

Monday, November 12th, 2012

I think Politico’s Jonathan Martin must have been told, “Write a piece in which prominent Republicans find as many ways as possible to say ‘epistemic closure’ without ever actually using that phrase.” The result is pretty impressive: The GOP’s media cocoon.

Bring on the thesaurus:

  • “a political-media cocoon that has become intellectually suffocating and self-defeating”
  • Pauline Kaelism
  • “the hermetically sealed bubble — except it’s not confined to geography but rather a self-selected media universe in which only their own views are reinforced and an alternate reality is reflected”
  • “‘an era of on-demand reality’”
  • “‘We have become what the left was in the ’70s — insular.’”
  • “…this reassuring pocket universe”
  • “Like a political version of ‘Thelma and Louise,’ some far-right conservatives are in such denial that they’d just as soon keep on driving off the cliff than face up to a reality they’d rather not confront”
  • “the choose-your-own-adventure news world”
  • “‘Social media has made it easier to self-select…a universe… that is wedded to its own self-fulfilling prophecies’”
  • “‘Unfortunately, for us Republicans who want to rebuild this party, the echo chamber [now] is louder and more difficult to overcome’”

The article goes on to talk about the market forces that create and sustain this hermetically sealed information space, and how Republicans concerned with winning future elections might work to transcend it.

More Good Stuff on Mercier and Sperber’s Argumentative Reasoning Theory

Wednesday, May 4th, 2011

You folks in the comments should stop arguing for a minute and check this out. Jonah Lehrer’s latest Wired Science column has more reaction to that very cool recent study by Mercier and Sperber on how confirmation bias can be explained as an evolutionary adaptation to the particular needs of reaching good decisions in a group context: The reason we reason. (With professional basketball content, too!)

It includes a link to an even cooler interview with Hugo Mercier from Edge.org: The argumentative theory: A conversation with Hugo Mercier.

Psychologists have shown that people have a very, very strong, robust confirmation bias. What this means is that when they have an idea, and they start to reason about that idea, they are going to mostly find arguments for their own idea. They’re going to come up with reasons why they’re right, they’re going to come up with justifications for their decisions. They’re not going to challenge themselves.

And the problem with the confirmation bias is that it leads people to make very bad decisions and to arrive at crazy beliefs. And it’s weird, when you think of it, that humans should be endowed with a confirmation bias. If the goal of reasoning were to help us arrive at better beliefs and make better decisions, then there should be no bias. The confirmation bias should really not exist at all. We have a very strong conflict here between the observations of empirical psychologists on the one hand and our assumption about reasoning on the other.

But if you take the point of view of the argumentative theory, having a confirmation bias makes complete sense. When you’re trying to convince someone, you don’t want to find arguments for the other side, you want to find arguments for your side. And that’s what the confirmation bias helps you do.

The idea here is that the confirmation bias is not a flaw of reasoning, it’s actually a feature. It is something that is built into reasoning; not because reasoning is flawed or because people are stupid, but because actually people are very good at reasoning — but they’re very good at reasoning for arguing. Not only does the argumentative theory explain the bias, it can also give us ideas about how to escape the bad consequences of the confirmation bias.

People mostly have a problem with the confirmation bias when they reason on their own, when no one is there to argue against their point of view. What has been observed is that often times, when people reason on their own, they’re unable to arrive at a good solution, at a good belief, or to make a good decision because they will only confirm their initial intuition.

On the other hand, when people are able to discuss their ideas with other people who disagree with them, then the confirmation biases of the different participants will balance each other out, and the group will be able to focus on the best solution. Thus, reasoning works much better in groups. When people reason on their own, it’s very likely that they are going to go down a wrong path. But when they’re actually able to reason together, they are much more likely to reach a correct solution.

See? I knew there was a reason for continuing to engage with shcb.
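For the programmers in the audience, here’s a toy back-of-the-envelope simulation of the Mercier/Sperber claim. The setup, the “effort” levels, and the option qualities are entirely my own invention, not anything from their paper — just a sketch of how individually biased evidence-gathering might still add up to a decent group decision:

```python
import random

# Toy model of argumentative reasoning. All parameters below are my own
# illustrative assumptions, not figures from Mercier and Sperber.

random.seed(0)
TRUE_QUALITY = {"A": 0.7, "B": 0.3}   # option A really is the better one

def arguments_found(hunch):
    """Evidence an agent digs up, searching mostly on its own side."""
    found = {}
    for option, quality in TRUE_QUALITY.items():
        effort = 10 if option == hunch else 2   # confirmation bias at work
        found[option] = sum(random.random() < quality for _ in range(effort))
    return found

def solo_choice(hunch):
    """A lone reasoner goes with whichever side its own search favored."""
    evidence = arguments_found(hunch)
    return max(evidence, key=evidence.get)

def group_choice(hunches):
    """A group pools everyone's arguments and picks the stronger overall case."""
    pooled = {"A": 0, "B": 0}
    for hunch in hunches:
        for option, count in arguments_found(hunch).items():
            pooled[option] += count
    return max(pooled, key=pooled.get)

TRIALS = 1000
solo = sum(solo_choice("B") == "A" for _ in range(TRIALS)) / TRIALS
group = sum(group_choice(["A", "B", "B"]) == "A" for _ in range(TRIALS)) / TRIALS
print(f"lone reasoner with the wrong hunch: {solo:.0%} correct")
print(f"three people arguing it out:        {group:.0%} correct")
```

With these made-up numbers, the lone reasoner who starts out favoring the worse option mostly just ratifies that hunch, while the three-person argument lands on the better option far more often — which is the gist of the “reasoning works much better in groups” point above.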

Lies.com: Fulfilling the evolutionary imperative for argumentation since 1996.

How Do You _Know_?

Wednesday, December 1st, 2010

A word I like to throw around (especially when someone believes something different than what I believe) is epistemology. That’s the branch of philosophy that deals with the theory of knowledge, or how it is that we know what we know. When I think about epistemology I tend to think about the operation of my prefrontal cortex, as it carries out the so-called “executive function” of my brain: The evaluation of information gathered by the senses (these days, often via the distance-shrinking “perception engine” of the Internet), the use of logic, the rational weighing of evidence, and so on. There is also bias, including the predisposition to believe certain things because they match my a priori belief (see confirmation bias), but here I assume we’re still talking mostly about the prefrontal cortex.

But there is another aspect of knowledge that I too-frequently ignore. That’s the feeling of truth, the sense of certainty that accompanies knowing something. Here I suspect we’re moving beyond the prefrontal cortex into evolutionarily older structures. Where does that feeling come from?

An interesting disorder that may shed some light on this is prosopagnosia, or “face blindness”, in which a person is unable to recognize faces, even though their ability to perceive the specific differences between one face and another remains intact. Even more interesting (at least to me) is the somewhat-related disorder called the Capgras delusion, in which a person becomes convinced that someone they know well (like a close relative or loved one) has been replaced by an identical-looking stranger. In an NPR story from earlier this year (Seeing Impostors: When Loved Ones Suddenly Aren’t), Jad Abumrad and Robert Krulwich spoke with neuroscientist V.S. Ramachandran about a possible explanation for the Capgras delusion:

According to Ramachandran, when we see someone we know, a part of our brain called the fusiform gyrus identifies the face: “That looks like mom!” That message is then sent to the amygdala, the part of our brains that activates the emotions we associate with that person. In patients experiencing Capgras, Ramachandran says, the connection between visual recognition and emotional recognition is severed. Thus the patient is left with a convincing face — “That looks like mom!” — but none of the accompanying feelings about his mother.

Ramachandran holds that we are so dependent on our emotional reactions to the world around us, that the emotional feeling “that’s not my mother” wins out over the visual perception that it is. The compromise worked out by the brain is that your mother was somehow replaced, and this impostor is part of a malevolent scheme.

I see this as tying in with Justin Barrett’s notion of a Hyperactive Agency Detection Device. The idea is that humans have evolved to experience a deep-rooted, powerful sense of “agency” when perceiving certain kinds of phenomena, and (this is important) to do so even in cases when there is no agent. As just one example, in evolutionary terms it may have been beneficial for us to believe that that rustle in the bushes was a large, hungry predator stalking us, rather than the wind, and to believe that viscerally, on an emotional level, rather than treating it as a passing supposition that we might or might not be bothered to act upon. The energy our ancestors wasted by overreacting to windblown leaves was more than made up for, the theory goes, by the survival benefit conferred by being hyperalert to actual threats.
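If you like seeing that trade-off in numbers, here’s a tiny back-of-the-envelope calculation. The probabilities and costs are pure invention on my part, just to show how a lopsided penalty for misses makes the “jumpy” strategy pay off:

```python
# Back-of-the-envelope cost comparison for a rustle-in-the-bushes detector.
# All numbers are made-up illustrations, not empirical estimates.

P_PREDATOR = 0.01      # how often a rustle really is a predator
COST_FLEE = 1          # small price for needlessly running from the wind
COST_EATEN = 10_000    # catastrophic price for ignoring a real predator

def expected_cost(always_flee: bool) -> float:
    """Average cost per rustle for a given policy."""
    if always_flee:
        # The jumpy ancestor pays the fleeing cost every time, but never gets eaten.
        return COST_FLEE
    # The calm ancestor pays nothing for wind, but occasionally gets eaten.
    return P_PREDATOR * COST_EATEN

print("jumpy (always flee):", expected_cost(True))    # 1
print("calm (ignore it):   ", expected_cost(False))   # 100.0
```

Even with only a one-in-a-hundred chance that the rustle is real, over-reacting comes out two orders of magnitude cheaper under these assumed costs — which is the shape of the argument for a hyperactive detector.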

Having evolved this generalized mechanism for “knowing” things that are not necessarily so, we now experience all kinds of interesting consequences: A propensity to believe that the universe was created specifically for us by an imaginary, omnipotent being or beings. A belief that intelligent aliens from other worlds are kidnapping people, taking them aboard invisible spaceships, and subjecting them to anal probes. A belief that some dramatic, emotionally traumatic event (the assassination of John F. Kennedy, the 9/11 attacks) must have been the result of a conspiracy in which our own government was complicit. For a significant subset of the population, these and other conspiracy theories are not merely things that they suspect. They are things that they know.

Jonah Lehrer blogged yesterday about a recent study examining the role of a brain structure called the insula in mediating between physical sensations (like the feeling of warmth or cold one receives from holding a hot or cold object) and a willingness to extend trust to a trading partner: Trust and temperature. I especially liked this part:

We like to see ourselves as Promethean creatures, mostly liberated from this sack of meat we have to carry around for support. (John Updike, as usual, said it best: “We think we are what we think when in truth we are upright bags of tripe.”) But what the insula and these studies of embodied cognition demonstrate is that our mind is impossibly intertwined with carnal changes we can’t explain or comprehend.

I know what I know because my rational mind has analyzed facts and evidence, sure. But that’s not the whole story. The signals delivered to me by my body — chemical cues, sensations of warmth and cold, and the murky actions of older, deeper mechanisms that reach me as visceral emotions — play a large part. Perhaps the major part.

I just know it.