There exist in our brains two independent systems for organizing knowledge. Kahneman calls them System One and System Two. System One is amazingly fast, allowing us to recognize faces and understand speech in a fraction of a second. It must have evolved from the ancient little brains that allowed our agile mammalian ancestors to survive in a world of big reptilian predators. Survival in the jungle requires a brain that makes quick decisions based on limited information. Intuition is the name we give to judgments based on the quick action of System One. It makes judgments and takes action without waiting for our conscious awareness to catch up with it. The most remarkable fact about System One is that it has immediate access to a vast store of memories that it uses as a basis for judgment. The memories that are most accessible are those associated with strong emotions, with fear and pain and hatred. The resulting judgments are often wrong, but in the world of the jungle it is safer to be wrong and quick than to be right and slow.
System Two is the slow process of forming judgments based on conscious thinking and critical examination of evidence. It appraises the actions of System One. It gives us a chance to correct mistakes and revise opinions. It probably evolved more recently than System One, after our primate ancestors became arboreal and had the leisure to think things over. An ape in a tree is not so much concerned with predators as with the acquisition and defense of territory. System Two enables a family group to make plans and coordinate activities. After we became human, System Two enabled us to create art and culture.
The question then arises: Why do we not abandon the error-prone System One and let the more reliable System Two rule our lives? Kahneman gives a simple answer to this question: System Two is lazy. To activate System Two requires mental effort. Mental effort is costly in time and also in calories. Precise measurements of blood chemistry show that consumption of glucose increases when System Two is active. Thinking is hard work, and our daily lives are organized so as to economize on thinking. Many of our intellectual tools, such as mathematics and rhetoric and logic, are convenient substitutes for thinking. So long as we are engaged in the routine skills of calculating and talking and writing, we are not thinking, and System One is in charge. We make the mental effort to activate System Two only after we have exhausted the possible alternatives.
System One is much more vulnerable to illusions, but System Two is not immune to them. Kahneman uses the phrase “availability bias” to mean a biased judgment based on whichever memory happens to be most quickly available; the judgment does not wait to examine a bigger sample of less cogent memories. A striking example of availability bias is the fact that sharks save the lives of swimmers. Careful analysis of deaths in the ocean near San Diego shows that on average, the death of each swimmer killed by a shark saves the lives of ten others. Every time a swimmer is killed, the number of deaths by drowning goes down for a few years and then returns to the normal level. The effect occurs because reports of death by shark attack are remembered more vividly than reports of drownings. System One is strongly biased, paying more prompt attention to sharks than to riptides, which occur more frequently and may be equally lethal. In this case, System Two probably shares the same bias. Memories of shark attacks are tied to strong emotions and are therefore more available to both systems.
Kahneman is a psychologist who won a Nobel Prize for economics. His great achievement was to turn psychology into a quantitative science. He made our mental processes subject to precise measurement and exact calculation, by studying in detail how we deal with dollars and cents. By making psychology quantitative, he incidentally achieved a powerful new understanding of economics. A large part of his book is devoted to stories illustrating the various illusions to which supposedly rational people succumb. Each story describes an experiment, examining the behavior of students or citizens who are confronted with choices under controlled conditions. The subjects make decisions that can be precisely measured and recorded. The majority of the decisions are numerical, concerned with payments of money or calculations of probability. The stories demonstrate how far our behavior differs from the behavior of the mythical “rational actor” who obeys the rules of classical economics.

A typical example of a Kahneman experiment is the coffee mug experiment, designed to measure a form of bias that he calls the “endowment effect.” The endowment effect is our tendency to value an object more highly when we own it than when someone else owns it. Coffee mugs are intended to be useful as well as elegant, so that people who own them become personally attached to them. A simple version of the experiment has two groups of people, sellers and buyers, picked at random from a population of students. Each seller is given a mug and invited to sell it to a buyer. The buyers are given nothing and are invited to use their own money to buy a mug from a seller. The average prices offered in a typical experiment were: sellers $7.12, buyers $2.87. Because the price gap was so large, few mugs were actually sold.
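The numbers invite a back-of-the-envelope illustration. The sketch below, in Python, is not Kahneman's procedure or his data; the valuation ranges and the size of the sellers' premium are assumptions chosen only to echo the averages quoted above. It simply shows why so wide a gap between asking and offering prices chokes off trade.

```python
# Toy illustration of the coffee-mug experiment described above.
# The valuation ranges and the "endowment premium" are assumptions
# chosen to echo the reported averages (sellers ~$7.12, buyers ~$2.87);
# they are not Kahneman's actual data.
import random

random.seed(0)

N = 100  # hypothetical number of seller-buyer pairs

# Buyers state what they would pay for a mug they do not own.
buyer_bids = [random.uniform(1.50, 4.25) for _ in range(N)]

# Sellers, who own a mug, ask roughly two to three times as much:
# the assumed endowment premium.
seller_asks = [random.uniform(5.50, 8.75) for _ in range(N)]

# A trade happens only when a buyer's bid meets a seller's ask.
trades = sum(1 for ask, bid in zip(seller_asks, buyer_bids) if bid >= ask)

print(f"average ask: ${sum(seller_asks) / N:.2f}")
print(f"average bid: ${sum(buyer_bids) / N:.2f}")
print(f"mugs sold:   {trades} of {N}")
```

With these assumed valuations, the average ask and bid come out near the quoted $7.12 and $2.87, and essentially no mugs change hands.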
The experiment convincingly demolished the central dogma of classical economics. The central dogma says that in a free market, buyers and sellers will agree on a price that both sides regard as fair. The dogma is true for professional traders trading stocks in a stock market. It is untrue for nonprofessional buyers and sellers because of the endowment effect. Trading that should be profitable to both sides does not occur, because most people do not think like traders.