Don’t judge me just because I’m biased October 1, 2015Posted by rm42 in Philosophy.
We all like to think of ourselves as fair-minded individuals. I would think that even people who are used to lying to others believe that they are honest with themselves. It is therefore surprising to discover mechanisms within us that, to a greater or lesser degree, hide the truth from ourselves.
In my previous writings I have discussed how denial and pride sometimes act as barriers that shield us from realizations that, while true, may be emotionally undesirable. Today I want to talk about another mechanism that can distort our perception of reality, even when that reality is not necessarily unpleasant. That mechanism is called “confirmation bias”.
One definition of “bias” states: “Bias is an inclination to present or hold a partial perspective at the expense of (possibly equally valid) alternatives.” Notice that bias is “an inclination”, a tendency. We could compare it to the subtle influence of underwater currents that can deviate a boat from its intended course. I think the image of a judge evaluating the evidence before him is one that readily comes to mind when talking about bias. We all dread the idea of having to present a case before a judge who holds a bias against us, whether due to our race, social class, background, etc. In that situation, we may be quick to conclude that the judge in question is corrupt or even evil. However, one should not be too quick to judge the morality or sense of justice of someone who holds a bias. As another book1 states: “Bias is not the same as maliciousness or dishonesty. Biases are unconscious assumptions or unrecognized blind spots”. That is right: someone who holds a bias may not be aware of it and may otherwise be a very intelligent and fair-minded individual.
Of course, there are other forms of bias that may be the result of conscious behavior. For example, many statistical studies commissioned by corporations hoping for a favorable report have shown evident bias. But those forms of bias are not what concerns us in this article. For the purposes of this writing, we are going to define “biased knowledge” as “the result of unconsciously skewing the information, or the meaning of the information, before us”.
The subject of bias has long been a focus of study by psychologists. In fact, they have identified many different types of biases. One of the most interesting, and perhaps the most prevalent, is the one known as “confirmation bias”. Wikipedia starts its article on confirmation bias as follows:
“Confirmation bias, also called myside bias, is the tendency to search for, interpret, favor, and recall information in a way that confirms one’s beliefs or hypotheses while giving disproportionately less attention to information that contradicts it.”
Raymond S. Nickerson, in his excellent paper entitled “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises”2, demonstrates quite convincingly that confirmation bias is not only completely unconscious but is observable even in situations that do not engage the person emotionally. And that is very interesting. One would expect bias in situations involving things we have an affection for, or things we dislike. But if it is found even in situations where we would expect to be perfectly impartial, then something very interesting must be going on in the heuristic techniques we use — that is, in the way we use our brains to analyze, understand, and process our observations. Maybe we haven’t learned to use our powerful brains as well as we think.
Of course, it is far outside the scope of this writing to completely explain how confirmation bias works and how to avoid it. I just hope to bring your attention to it so that maybe you can begin to identify it. Later you can do more research on your own. But I at least want to mention some interesting aspects of confirmation bias to be aware of. Specifically, Nickerson, through experimental data, was able to identify the following tendencies.
How Confirmation Bias affects people:
- People tend to seek information that they consider supportive of favored hypotheses or existing beliefs and to interpret information in ways that are partial to those hypotheses or beliefs.
Additionally, people seek the specific type of information that they would expect to find assuming the hypothesis is true, instead of looking for evidence that would disprove it.
- People tend to give greater weight to information that supports existing beliefs or opinions than to information that runs counter to them. At the least, they appear to be less receptive to counterindicative information. They also sometimes give weight to information that is consistent with a hypothesis even when the same evidence is consistent with alternative hypotheses.
- People tend not to seek and perhaps even to avoid information that would be considered counterindicative with respect to already held hypotheses or beliefs and supportive of alternative possibilities.
- People have a tendency to recall or produce reasons supporting the side they favor—my-side bias—on a controversial issue and not to recall or produce reasons supporting the other side. (This tendency, however, may have motivational reasons as well.)
- Identical evidence tends to be interpreted one way in relation to a favored theory and another way in relation to a theory that is not favored.
- People sometimes see in data the patterns for which they are looking, regardless of whether the patterns are really there. (This is why priming people beforehand, and stereotyping, have so much power.) Confirmation bias can also exacerbate the effects of mental maladies such as hypochondria, depression, paranoia, etc.
- Information acquired early in the process of investigation is likely to carry more weight than that acquired later. (This is called the primacy effect.) Another way of putting it is that “people often form an opinion early in the process and then evaluate subsequently acquired information in a way that is partial to that opinion”. This is closely related to (and can perhaps be seen as a manifestation of) the belief persistence effect. Once a belief or opinion has been formed, it can be very resistant to change, even in the face of fairly compelling evidence that it is wrong. Evidently, people are more likely to question information that conflicts with preexisting beliefs than information that is consistent with them, and are more likely to see ambiguous information as confirming preexisting beliefs rather than disconfirming them.
- Most of the beliefs people hold have some basis or other, but because of confirmation bias, many beliefs may be held with a strength or degree of certainty that exceeds what the evidence justifies.
Nickerson goes on to list several examples of confirmation bias in action: the case of people who see patterns in numbers (numerology), the terrible witch hunts, bogus medicine, and, of course, the effects of confirmation bias on jurors. But perhaps the most interesting examples he gives are the ones dealing with scientists. After listing several of these, he states: “the bias is definitely in the direction of giving the existing theory the benefit of the doubt, so long as there is room for doubt and, in some cases, even when there is not”. “The usual strategy for dealing with anomalous data is first to challenge the data themselves. If they prove to be reliable, the next step is to complicate the existing theory just enough to accommodate the anomalous result.”
Well, I hope this is enough to make you curious about, and hopefully wary of, this most pernicious mental pitfall. In future posts, I hope to use some of these concepts to evaluate some widely held beliefs. Stay tuned.