Three interesting things I stumbled upon while on vacation.
First, Joe Keohane wrote an article for The Boston Globe titled How facts backfire. It appears that our own preferences and the facts have some trouble coexisting:
In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
How far does education go toward remedying the situation? Keohane observes:
A 2006 study by Charles Taber and Milton Lodge at Stony Brook University showed that politically sophisticated thinkers were even less open to new information than less sophisticated types. These people may be factually right about 90 percent of things, but their confidence makes it nearly impossible to correct the 10 percent on which they’re totally wrong.
Insert witty comment here.
What kinds of mechanisms are responsible for such cognitive mishaps? Enter Eric Fernandez's Cognitive Biases - A Visual Study Guide, a great introduction to the field. How about some Semmelweis reflex, defined as "The tendency to reject new evidence that contradicts an established paradigm"? Or would you prefer a Clustering illusion, "The tendency to see patterns where actually none exist"?
Nowadays the biblioblogosphere is talking about textbooks with less text (cf. James F. McGrath and Mark Goodacre for some of the most recent entries). To my aesthetic eye, Fernandez's work, meant as a memory aid for the dozens of different cognitive biases and as a way to grasp the field as a whole, could serve as a model for biblical study material as well: it is informative as well as pleasing to peruse. One curious connection should be pointed out: Fernandez's guide has close ties to the Wikipedia pages on cognitive biases, i.e. most of the text in the Visual Study Guide is lifted straight out of Wikipedia.
This works for one reason: the Wikipedia articles quoted are in top-notch shape, as happens to be the case with e.g. Confirmation bias, thanks to the efforts of Martin Poulter, PhD, and the Wikipedia community. In his blog Bias and Belief, Poulter writes about the process of writing that entry, which was chosen as Wikipedia's "Today's Featured Article" for Friday, July 23rd. On the topic itself, he thinks that people are generally not that stupid:
My reading of confirmation bias is different: people are very clever in how they defend belief systems or stereotypes that they are somehow attached to.
Oh, and a situation where people holding different views each evaluate the same (ambiguous) evidence as supporting their own position is known as biased assimilation in cognitive studies. Just for the record.