Monday, March 24, 2008

Explaining Bias

The ability of fMRI scans to detect which modules of the brain are active during cognitive processes provides a crude but nonetheless revealing window into how we “think”: it lets us test whether some of our broad assumptions about thinking are actually true.

For example, a widely referenced July 2006 Scientific American “Skeptic” column by Michael Shermer proclaimed: “A recent brain-imaging study shows that our political predilections are a product of unconscious confirmation bias.”

The column related that, shortly before the 2004 presidential election, fifteen subjects who described themselves as “strong Republicans” and fifteen who described themselves as “strong Democrats” underwent fMRI brain scans while being asked to assess statements by George W. Bush and John F. Kerry in which the candidates appeared to contradict themselves. In all cases the subjects were critical of the candidate they opposed and spun explanations excusing the candidate they supported. … What was surprising, however, was what the fMRI scans revealed: parts of the brain associated with processing emotions, resolving conflict, and making moral judgments were activated, but the part of the brain associated with reasoning was not. When the subjects finally arrived at a conclusion that satisfied them, the part of the brain associated with reward and pleasure was activated.

On the strength of this and other studies, it is now generally accepted that confirmation bias exists, that it arises from unconscious processing in the brain, and that it leads us to favor data that confirms our beliefs and theories while ignoring or discounting data that challenges them.

Awareness of confirmation bias is hardly new, as demonstrated by the Francis Bacon quote accompanying Shermer’s column:

“The human understanding when it has once adopted an opinion ... draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises ... in order that by this great and pernicious predetermination the authority of its former conclusions may remain inviolate.” --Francis Bacon, Novum Organum, 1620

What has changed, however, is that when Bacon and others observed the phenomenon it could not be “proven”: it was not science … it was only an interesting opinion. Now, with the advent of fMRI and other tools, it is possible to test theories about unconscious processing in the brain, although admittedly still crudely.

One of the theories “roughly supported” by the fMRI data is that the brain processes ideas much as it is known to process vision: discrete features of the raw input are separated and processed in parallel by numerous brain modules, and the output from those modules is “somehow” merged to produce the colorful, continuous, 3-D vision we experience. Whenever the brain cannot successfully merge all of the output, however, it “makes up” a possible or probable solution and presents it to us as reality; the result is the standard, repeatable “optical illusions” we’re all familiar with.
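To make the analogy concrete, here is a toy sketch in Python. It is purely illustrative: the module names (edge_module, color_module, depth_module), the numbers, and the merge rule are all invented, and none of it is meant as neuroscience. It only shows the shape of the process the paragraph describes: independent partial answers produced in parallel, a merge step, and a fabricated “best guess” when the merge fails.

# Toy illustration only: parallel "feature modules" plus a merge step that
# fabricates a plausible answer when the modules disagree -- the rough
# analogue of an optical illusion. All names and numbers are invented.
from concurrent.futures import ThreadPoolExecutor
from statistics import mean, pstdev

def edge_module(signal):   return mean(signal) * 1.00
def color_module(signal):  return mean(signal) * 0.98
def depth_module(signal):  return mean(signal) * 1.55   # deliberately skewed

def perceive(signal, tolerance=0.1):
    modules = [edge_module, color_module, depth_module]
    with ThreadPoolExecutor() as pool:                   # crude stand-in for parallel processing
        estimates = list(pool.map(lambda m: m(signal), modules))
    if pstdev(estimates) <= tolerance * abs(mean(estimates)):
        return mean(estimates), "outputs merged cleanly"
    # The outputs conflict: "make up" a plausible answer and present it as reality.
    return mean(estimates), "conflict: best guess presented as reality"

print(perceive([0.8, 1.0, 1.2]))   # the skewed module forces the "best guess" branch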

Knowing that the brain processes visual data in this way, and observing the distributed processing disclosed by fMRI, makes it reasonable to hypothesize that the brain processes ideas by weighing them against existing beliefs and biasing our judgment of those ideas accordingly. … Interestingly, this would also explain the time delays that Jung observed during the word-association tests that led to his theory of complexes; the data could be reinterpreted as evidence, not of complexes, but of complex, time-consuming parallel processing working to resolve the competing, sometimes conflicting ideas evoked by the words. (Although various word-association tests have already been conducted during fMRI, I am not aware of any aimed at exploring Jung’s studies.)
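The confirmation-bias half of that hypothesis can be sketched the same way. The few lines of Python below are again purely illustrative (the judge function, its bias parameter, and every number are invented for the example): evidence that agrees with an existing belief is inflated, while evidence of the same strength that contradicts it is discounted.

# Toy illustration only: a judgment is biased toward whatever agrees with
# an existing belief. Scores are in [-1, 1]; everything here is invented.
def judge(evidence, prior_belief, bias=0.7):
    agreement = evidence * prior_belief                  # > 0 when evidence and belief point the same way
    if agreement >= 0:
        return min(1.0, abs(evidence) * (1 + bias))      # confirming data is weighted up
    return abs(evidence) * (1 - bias)                    # conflicting data is weighted down

strong_partisan = 0.9
print(judge(evidence=0.6,  prior_belief=strong_partisan))    # 1.0 (capped): favorable statement is embraced
print(judge(evidence=-0.6, prior_belief=strong_partisan))    # ~0.18: equally strong contrary statement is dismissed

Neither sketch models the brain; each simply restates the hypothesis in executable form so the asymmetry is easy to see.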

An awareness that unconscious bias exists in all of us, scientists included, helps to explain the not-infrequent battles that arise whenever new data begins to conflict with accepted theory. It is normal, and unfortunately natural, for proponents of an existing theory or viewpoint to fiercely reject, without real examination, any data that might challenge the status quo. Hopefully, the growing awareness that we are all “wired” to do this will lead us to consciously resist the impulse.
