How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life
1. "When examining evidence relevant to a given belief, people are inclined to see what they expect to see, and conclude what they expect to conclude. Information that is consistent with our pre-existing beliefs is often accepted at face value, whereas evidence that contradicts them is critically scrutinized and discounted. Our beliefs may thus be less responsive than they should be to the implications of new information."
― Thomas Gilovich, How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life
2. "People will always prefer black-and-white over shades of grey, and so there will always be the temptation to hold overly simplified beliefs and to hold them with excessive confidence."
3. "What we believe is heavily influenced by what we think others believe."
4. "it seems that once again people engage in a search for evidence that is biased toward confirmation. Asked to assess the similarity of two entities, people pay more attention to the ways in which they are similar than to the ways in which they differ. Asked to assess dissimilarity, they become more concerned with differences than with similarities. In other words, when testing a hypothesis of similarity, people look for evidence of similarity rather than dissimilarity, and when testing a hypothesis of dissimilarity, they do the opposite. The relationship one perceives between two entities, then, can vary with the precise form of the question that is asked."
5. "We humans seem to be extremely good at generating ideas, theories, and explanations that have the ring of plausibility. We may be relatively deficient, however, in evaluating and testing our ideas once they are formed."
6. "When we do cross paths with people whose beliefs and attitudes conflict with our own, we are rarely challenged."
7. "How do we distinguish between the legitimate skepticism of those who scoffed at cold fusion, and the stifling dogma of the seventeenth-century clergymen who, doubting Galileo's claim that the earth was not the center of the solar system, put him under house arrest for the last eight years of his life? In part, the answer lies in the distinction between skepticism and closed-mindedness. Many scientists who were skeptical about cold fusion nevertheless tried to replicate the reported phenomenon in their own labs; Galileo's critics refused to look at the pertinent data."
8. "Because so much disagreement remains hidden, our beliefs are not properly shaped by healthy scrutiny and debate. The absence of such argument also leads us to exaggerate the extent to which other people believe the way we do."
9. "We may be particularly inclined to acquire and retain beliefs that make us feel good."
10. "we believe certain things because they ought to be true."
11. "Inside accounts of Presidential advisory groups make it clear that the failure to express dissent can have direct, immediate, and severe consequences... Because so much disagreement remains hidden, our beliefs are not properly shaped by healthy scrutiny and debate. The absence of such argument also leads us to exaggerate the extent to which other people believe the way we do. Bolstered by such a false sense of social support, our beliefs strike us as more resistant to subsequent logical and empirical challenge."
12. "For desired conclusions, we ask ourselves, 'Can I believe this?', but for unpalatable conclusions we ask, 'Must I believe this?'"
13. "When we prefer to believe something, we may approach the relevant evidence by asking ourselves, 'What evidence is there to support this belief?'... Note that this question is not unbiased: it directs our attention to supportive evidence and away from information that might contradict the desired conclusion. Because it is almost always possible to uncover some supportive evidence, the asymmetrical way we frame the question makes us overly likely to become convinced of what we hope to be true."
14. "We hold many dubious beliefs, in other words, not because they satisfy some important psychological need, but because they seem to be the most sensible conclusions consistent with the available evidence. People hold such beliefs because they seem, in the words of Robert Merton, to be the 'irresistible products of their own experience.' They are the products, not of irrationality, but of flawed rationality."
15. "A person's conclusions can only be as solid as the information on which they are based. Thus, a person who is exposed to almost nothing but inaccurate information on a given subject almost inevitably develops an erroneous belief, a belief that can seem to be 'an irresistible product' of the individual's (secondhand) experience."
16. "People rate themselves more favorably on amorphous traits like sensitivity and idealism (at the 73rd percentile, on average) than on relatively straightforward traits like thriftiness and being well-read (48th percentile)."
17. "Psychologists have known for some time that rewarding desirable responses is generally more effective in shaping behavior than punishing undesirable responses. However, the average person tends to find this fact surprising, and punishment has been the preferred reinforcer for the majority of parents, both in modern society and in earlier periods."
18. "Finally, it has been shown that the tendency for people to think of themselves as above average is reduced—even for ambiguous traits—when people are required to use specific definitions of each trait in their judgments."
19. "some evidence has accumulated that people who habitually fail to put the most favorable cast on their circumstances run the risk of depression."
20. "'When people learn no tools of judgment and merely follow their hopes, the seeds of political manipulation are sown.' As individuals and as a society, we should be less accepting of superstition and sloppy thinking, and should strive to develop those 'habits of mind' that promote a more accurate view of the world."