Wednesday, October 9, 2013


A great article from The Skeptic's Dictionary about self-deception. Confirmation bias is a tendency we all share, and knowing this, we should be forewarned so that we make a concerted effort to counteract it in our search for truth:
Self-deception is the process or fact of misleading ourselves to accept claims about ourselves as true or valid when they are false or invalid. Self-deception, in short, is a way we justify false beliefs about ourselves to ourselves.
When philosophers and psychologists discuss self-deception, they usually focus on unconscious motivations and intentions. They also usually consider self-deception as a bad thing, something to guard against. To explain how self-deception works, they focus on self-interest, prejudice, desire, insecurity, and other psychological factors that unconsciously affect the will to believe in a negative way. A common example would be that of a parent who believes his child is telling the truth even though the objective evidence strongly supports the claim that the child is lying. The parent, it is said, deceives himself or herself into believing the child because the parent desires that the child tell the truth. A belief so motivated is usually considered more flawed than one due to lack of ability to evaluate evidence properly. The former is considered to be a kind of moral flaw, a kind of dishonesty, and irrational. The latter is considered to be a matter of fate: some people are just not gifted enough to make proper inferences from the data of perception and experience.

However, it is possible that the parent in the above example believes the child because he or she has intimate and extensive experience with the child but not with the child's accusers. The parent may be unaffected by unconscious desires and be reasoning on the basis of what he or she knows about the child but does not know about the others involved. The parent may have very good reasons for trusting the child and not trusting the accusers. In short, an apparent act of self-deception may be explicable in purely cognitive terms without any reference to unconscious motivations or irrationality. The self-deception may be neither a moral nor an intellectual flaw. It may be the inevitable existential outcome of a basically honest and intelligent person who has extremely good knowledge of his or her child, knows that things are not always as they appear to be, has little or no knowledge of the child's accusers, and thus has not sufficient reason for doubting the child. It may be the case that an independent party could examine the situation and agree that the evidence is overwhelming that the child is lying, but if he or she were wrong we would say that he or she was mistaken, not self-deceived. We consider the parent to be self-deceived because we assume that he or she is not simply mistaken, but is being irrational. How can we be sure?
A more interesting case would be one where (1) a parent has good reason to believe that his or her child is likely to tell the truth in any given situation, (2) the objective evidence points to innocence, (3) the parent has no reason to especially trust the child's accusers, but (4) the parent believes the child's accusers anyway. Such a case is so defined as to be practically impossible to explain without assuming some sort of unconscious and irrational motivation (or brain disorder) on the part of the parent. However, if cognitive incompetence is allowed as an explanation for apparently irrational beliefs, then appeals to unconscious psychological mechanisms are not necessary even in this case.
Fortunately, it is not necessary to know whether self-deception is due to unconscious motivations or not in order to know that there are certain situations where self-deception is so common that we must systematically take steps to avoid it. Such is the case with belief in paranormal or occult phenomena such as ESP, prophetic dreams, dowsing, therapeutic touch, facilitated communication, and a host of other topics taken up in the Skeptic's Dictionary.
In How We Know What Isn't So, Thomas Gilovich describes the details of many studies which make it clear that we must be on guard against the tendencies to
1. misperceive random data and see patterns where there are none;
2. misinterpret incomplete or unrepresentative data and give extra attention to confirmatory data while drawing conclusions without attending to or seeking out disconfirmatory data;
3. make biased evaluations of ambiguous or inconsistent data, tending to be uncritical of supportive data and very critical of unsupportive data.
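The first tendency is easy to demonstrate for oneself. The sketch below (my own illustration, not from the article) simulates fair coin flips and measures the longest streak of identical outcomes; genuinely random sequences routinely contain streaks long enough that people mistake them for meaningful patterns:

```python
import random

def longest_run(flips):
    """Return the length of the longest streak of identical outcomes."""
    if not flips:
        return 0
    best = cur = 1
    for prev, nxt in zip(flips, flips[1:]):
        cur = cur + 1 if nxt == prev else 1
        best = max(best, cur)
    return best

random.seed(42)  # fixed seed so the run is reproducible
flips = [random.choice("HT") for _ in range(100)]
# In 100 fair flips, streaks of six or more of the same face are typical,
# yet such streaks are exactly what pattern-hungry observers seize on.
print(longest_run(flips))
```

Most people, asked to write down a "random-looking" sequence by hand, avoid long streaks, which is why real randomness so often looks rigged to us.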
It is because of these tendencies that scientists require clearly defined, controlled, double-blind, randomized, repeatable, publicly presented studies. Otherwise, we run a great risk of deceiving ourselves and believing things that are not true. It is also because of these tendencies that in trying to establish beliefs non-scientists ought to try to imitate science whenever possible. In fact, scientists must keep reminding themselves of these tendencies and guard against pathological science.
Many people believe, however, that as long as they guard themselves against wishful thinking they are unlikely to deceive themselves. Actually, if one believes that all one must be on guard against is wishful thinking, then one may be more rather than less liable to self-deception. For example, many intelligent people have invested in numerous fraudulent products that promised to save money, the environment, or the world, not because they were guilty of wishful thinking but because they weren't. Since they were not guilty of wishful thinking, they felt assured that they were correct in defending their product. They could easily see the flaws in critical comments. They were adept at finding every weakness in opponents. They were sometimes brilliant in defense of their useless devices. Their errors were cognitive, not emotional. They misinterpreted data. They gave full attention to confirmatory data, but were unaware of or oblivious to disconfirmatory data. They sometimes were not aware that the way in which they were selecting data made it impossible for contrary data to have a chance to occur. They were adept at interpreting data favorably when either the goal or the data itself was ambiguous or vague. They were sometimes brilliant in arguing away inconsistent data with ad hoc hypotheses. Yet, had they taken the time to design a clear test with proper controls, they could have saved themselves a great deal of money and embarrassment. The defenders of the DKL LifeGuard and the many defenders of perpetual motion machines and free energy devices are not necessarily driven by the desire to believe in their magical devices. They may simply be the victims of quite ordinary cognitive obstacles to critical thinking. Likewise for all those nurses who believe in therapeutic touch and those defenders of facilitated communication, ESP, astrology, biorhythms, crystal power, dowsing, and a host of other notions that seem to have been clearly refuted by the scientific evidence. 
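The bookkeeping error described above, attending to confirmations while letting disconfirming cases go uncounted, can also be made concrete. In this sketch (my own, not from the article), "predictions" and outcomes are generated independently, so there is no real connection between them. Counting only the hits makes the predictor look impressive; tallying the full 2x2 table of predicted/not-predicted against occurred/not-occurred shows no predictive power at all:

```python
import random

random.seed(1)
N = 1000
# Predictions and outcomes are independent random events: no real link exists.
predictions = [random.random() < 0.3 for _ in range(N)]
outcomes = [random.random() < 0.3 for _ in range(N)]

# Biased bookkeeping: count only the confirmations.
hits = sum(p and o for p, o in zip(predictions, outcomes))

# Controlled bookkeeping: the full 2x2 table, including disconfirming cells.
table = {(p, o): 0 for p in (True, False) for o in (True, False)}
for p, o in zip(predictions, outcomes):
    table[(p, o)] += 1

# If predictions mattered, the outcome rate when predicted would exceed
# the base rate. Here the two rates agree: the hits were coincidence.
hit_rate_when_predicted = table[(True, True)] / (
    table[(True, True)] + table[(True, False)]
)
base_rate = (table[(True, True)] + table[(False, True)]) / N
print(hits, round(hit_rate_when_predicted, 2), round(base_rate, 2))
```

The dozens of "hits" are real, which is why recounting them feels like evidence; only the deliberately recorded misses and non-events reveal that nothing is going on. This is the logic behind the controlled, randomized tests the article recommends.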
In short, self-deception is not necessarily a weakness of will, but may be a matter of ignorance, laziness, or cognitive incompetence.
On the other hand, self-deception may not always be a flaw and may even be beneficial at times. If we were too brutally honest and objective about our own abilities and about life in general, we might become debilitated by depression.
