Tuesday, August 3, 2010

Some Facts and Confirmation Biases

I am back from a three-week break, a vacation of sorts (insofar as PhD students have that luxury), and tired of the white margin on the left that this blog has had since the beginning. It is now gone, giving more space to the text. Some other minor tweaks can be found here and there, including the complete list of blogs I follow, taken directly from my feed reader. Please do note that it is a collection of the serious and the curious, and that I have all kinds of reasons, from scholarly pursuits to private amusement, to keep following them. Otherwise the visual style of the blog remains as simple as I have always preferred it to be.

Three interesting things I stumbled upon while on vacation.

First, Joe Keohane wrote an article for The Boston Globe titled "How facts backfire". It looks like there is some trouble with the cohabitation of our own preferences and the facts:

In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

How far does education help remedy the situation? Keohane observes:

A 2006 study by Charles Taber and Milton Lodge at Stony Brook University showed that politically sophisticated thinkers were even less open to new information than less sophisticated types. These people may be factually right about 90 percent of things, but their confidence makes it nearly impossible to correct the 10 percent on which they’re totally wrong.

Insert witty comment here.

What kinds of mechanisms are responsible for such cognitive mishaps? Enter Eric Fernandez's Cognitive Biases - A Visual Study Guide, a great introduction to the field. How about having some Semmelweis reflex, defined as "The tendency to reject new evidence that contradicts an established paradigm"? Or would you prefer some Clustering illusion, "The tendency to see patterns where actually none exist"?

Nowadays the biblioblogosphere talks about textbooks with less text (cf. James F. McGrath and Mark Goodacre for some of the most recent entries). To my aesthetic eye, Fernandez's work - meant to be used as a memory aid for the dozens of different cognitive biases, in an effort to gain a good understanding of the field as a whole - could serve as a model for biblical study material as well, being informative as well as pleasing to peruse. One curious connection should be pointed out: Fernandez's work has close ties to the Wikipedia pages concerning cognitive biases, i.e. most of the text in the Visual Study Guide is lifted straight out of Wikipedia.

It works because of one thing: the Wikipedia articles quoted are in top-notch shape, as happens to be the case with e.g. Confirmation bias, thanks to the efforts of Martin Poulter, PhD, and the Wikipedia community. In his blog Bias and Belief, Poulter writes about the process of writing that Wikipedia entry, which was chosen as Wikipedia's "Today's Featured Article" for Friday, July 23rd. Regarding the topic, he thinks that people are generally not that stupid:

My reading of confirmation bias is different: people are very clever in how they defend belief systems or stereotypes that they are somehow attached to.

Oh, and a situation where people with different views each evaluate the same (ambiguous) evidence as supporting their own point of view is known in cognitive studies as biased assimilation. Just for the record.

2 comments:

  1. Pseudo-Sibelius (04 August 2010, 00:49)

    That's an interesting and informative post on the confirmation bias, Timo. It confirmed what I've always believed about people who see things differently than I do. I have to wonder, though, why I'm immune to this bias myself.

    Seriously, though, I keep thinking there's an overlooked element here. Although you've fully convinced me of the role that cognitive biases and unabashed pseudoscience play in conspiracy theorizing, I still often get the impression that the authors who develop these theories sometimes proffer arguments that they know to be false. That is, even if they believe their own conspiracy theories (I'm not always sure), they rely significantly on deception in order to convince other people. Do you have an opinion about the role of intellectual dishonesty in conspiracy theorizing?

  2. An interesting question. Once I get my thoughts sorted out, I will write a blog post about this. Do people actually believe in some of the wilder theories they espouse? Apparently some do, but the short answer would be 'impossible to determine'. People have all sorts of reasons, from personal vendettas to genuine sanity issues to eccentric senses of humour, to argue for various things. There is a modern classic Finnish proverb by Paavo Haavikko, originally concerning politicians:

    "Parody is impossible; they do it themselves."

    It is not quite right. Parody is certainly possible, and it happens. But it is usually extremely hard, or even impossible, to tell parody from the real thing, be it politics, conspiracy theories, or fundamentalist religious attitudes. We can guess, but there is no way to really find out whether a given person consciously practices intellectual dishonesty, whether she has simply found a clever way to persuade herself to believe what others consider intellectually dishonest, or whether it is even a good idea to try to keep the two separate.
