Thinking, Fast and Slow — Summary

Learn the two sides of our thinking: automatic, fast, intuitive, and emotional reasoning versus slower, more deliberative, and more logical reasoning.

Our mind is divided into two systems: System 1, the automatic one, and System 2, the deliberate one.

System 1, the automatic system, is the one that makes you react when you hear a loud noise or touch a hot stove. This system was programmed by evolution to provide survival advantages. It operates automatically and very quickly, with little or no effort and no sense of voluntary control.

System 2 operates whenever we make a conscious choice and perform effortful mental activities. The operations of System 2 are associated with subjective experiences such as concentration.

The lazy deliberate system

One of the main tasks of System 2, the conscious system, is to monitor and check the suggestions of System 1.

Here is a puzzle. Try to solve it without thinking too much; answer through your intuition.

A bat and ball cost $1.10.
The bat costs one dollar more than the ball. How much does the ball cost?

You probably came up with $0.10, which is wrong. Do the math, and you will notice that the ball actually costs $0.05. If the ball cost $0.10, the bat would cost $1.10, and the total would be $0.10 + $1.10 = $1.20, not $1.10.
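To make the check concrete, here is a minimal sketch of the arithmetic in Python (the variable names are mine, not the book’s):

```python
# If the ball costs x dollars, the bat costs x + 1.00, and together they must cost 1.10:
#   x + (x + 1.00) = 1.10  ->  2x = 0.10  ->  x = 0.05
ball = 0.05
bat = ball + 1.00
print(round(ball + bat, 2))  # 1.1 -> the correct answer checks out

# The intuitive answer does not:
wrong_ball = 0.10
print(round(wrong_ball + (wrong_ball + 1.00), 2))  # 1.2, not 1.1
```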

This puzzle was used by the author and Shane Frederick to study how closely System 2 monitors System 1.

In people who gave the wrong answer, System 2 simply endorsed the suggestion of System 1. They also missed an important social cue: they didn’t ask themselves why someone would include a puzzle with such an obvious answer in a questionnaire.

The bat-and-ball problem exposes the innate mental laziness of our conscious system. Checking the answer of System 1 would have cost more energy, and since our brain tries to use the minimum amount of energy necessary, it often goes along with the answer of System 1. This is known as the law of least effort.

The lesson: don’t avoid System 2; it is an important part of our intelligence. If we don’t use it, our mind remains limited in strength and intelligence.

Priming: triggering related information

What do you think when you see the word fragment “SO_P”? Probably not much. Now consider the word “EAT,” then look back at “SO_P”: you probably completed it as “SOUP.” If you had seen the word “WASH” instead, you would probably have completed it as “SOAP.”

This process is known as priming: EAT primes the idea of SOUP, and WASH primes the idea of SOAP. Priming occurs when exposure to words, concepts, or events causes us to summon related information.

But priming does not work only with words. You have to accept the alien (and somewhat scary) idea that even your actions and emotions can be primed by events of which you are not even aware.

In a famous experiment, the psychologist John Bargh and his collaborators asked students at New York University to assemble four-word sentences from sets of five words. For one group, half of the sets contained words associated with the elderly, such as Florida, forgetful, gray, and wrinkle. After completing the task, the young participants were sent to another office down the hall for a second experiment. What they didn’t know was that the walk itself was what the experiment was about: the researchers meticulously measured the time the students took to get from one office to the other.

As Bargh had predicted, the students who had assembled sentences from the words associated with the elderly walked at a significantly slower pace than the others.

This “Florida effect” (named after one of the priming words) occurs completely outside our awareness. It has two stages: first, the set of words primes thoughts of old age without ever mentioning the word old; second, those thoughts trigger a behavior, walking slowly.

When the students were asked about it later, none of them reported having noticed a common theme in the words, and they insisted that nothing they did after the first experiment could have been influenced by the words they had encountered.

The “Florida effect” also works in reverse. In an experiment at a German university, students were asked to walk slowly, at about one-third their normal pace, for five minutes. Afterward, those students were much quicker to recognize words related to old age.

Reciprocal links are common in associative networks; smiling is one example. As studies have shown, smiles influence our thoughts and feelings, and vice versa. So the common admonition to act kindly regardless of how you feel is actually good advice.

Jumping to conclusions

The bias to believe and confirm

Daniel Gilbert, the author of “Stumbling on Happiness,” once wrote an essay titled “How Mental Systems Believe,” in which he developed a theory of believing and unbelieving that he traced back to the seventeenth-century philosopher Baruch Spinoza.

Gilbert proposed that understanding a statement must begin with an attempt to believe it; only then can you decide whether or not to unbelieve it. The initial attempt to believe, forming a mental image of what the idea would mean if it were true, is performed by the automatic system, System 1.

Gilbert sees unbelieving as an operation of System 2. In one of his experiments, participants were shown nonsensical assertions, each followed by the single word “true” or “false,” and were later tested on their memory of which sentences had been labeled “true.” When the subjects had to hold digits in memory while performing the task, they afterwards reported many of the false statements as true.

The outcome shows that System 2 is responsible for unbelieving: when it is otherwise engaged or lazy, System 1’s bias to believe goes unchecked. There is indeed evidence that when people are in a tired and depleted state, they are more susceptible to persuasive messages such as commercials.

Confirmation bias

Associative memory leads to a general confirmation bias. If someone asks you whether Bob is friendly, different instances of Bob’s behavior will come to mind than if you were asked “Is Bob unfriendly?” The tendency to search for confirming evidence is known as the positive test strategy, and it is also how System 2 tests a hypothesis. This has implications not only for everyday decisions but even for science: contrary to what philosophers of science recommend (trying to refute hypotheses), scientists often seek data that are likely to support their current beliefs and hypotheses.

The Halo Effect

Our minds tend to jump to conclusions without having enough information, which leads to judgment errors. The tendency to like or dislike everything about a person, including things we have not observed, is known as the halo effect, or exaggerated emotional coherence. Where information is missing, our mind fills the gap with an emotional guess.

These phenomena happen without our conscious awareness and affect our judgments and actions.

Heuristics: shortcuts for quick decisions

Heuristics are shortcuts our brains use to make decisions easier and faster. They are useful and necessary most of the time, but we tend to overuse them.

One such heuristic is the availability heuristic, by which we overestimate the probability of things we hear about often. For example, we judge the risk of car accidents to be greater than the risk of strokes, even though statistics show the opposite, because we get far more exposure to accidents in the media.

The substitution heuristic replaces one question with an easier one. For example, if someone asks you “How happy are you with your life?” you might substitute the easier question “How am I feeling right now?”
Or “Is this woman going to be successful in politics?” might be replaced with “Does this woman look like she could be successful in politics?”

Cognitive ease and strain

Our brain works in one of two modes: cognitive ease or cognitive strain. Cognitive ease means that everything feels normal, familiar, and effortless. We are more intuitive and happier, but also more prone to mistakes.

Cognitive strain means that an increased mobilization of System 2 is required. Judgments are double-checked. We are less creative but also less error-prone.

We can choose the right frame for different tasks. For persuasion, for example, cognitive ease combined with enough repetition works best: our minds have evolved to respond positively to clear, repeated messages.

The cognitive strain frame, in contrast, helps us with rational reasoning such as mathematics.

People tend to react differently to relative frequency than to statistical probability

When two groups are presented with the same statistic, the group that sees it expressed as a relative frequency tends to react more emotionally than the group that sees it expressed as a probability.

The Mr. Jones experiment showed this. A first group of psychiatrists at a hospital was told that “Mr. Jones has a 10% chance of committing a violent act,” while a second group was told that “of every 100 patients similar to Mr. Jones, 10 are likely to commit an act of violence.” The respondents in the second group denied his discharge twice as often as those in the first group.

Even though both statements express the same probability, the frequency wording produces a more vivid image of a violent act in the minds of the participants, which is why they respond differently from the group given the raw statistical probability.

Loss aversion

We often make emotional or “gut” decisions instead of rational ones. The author developed prospect theory, which challenges utility theory (the theory that says we always make the rational choice that brings us the highest utility).

Imagine this scenario: You are offered a gamble on the toss of a coin.
Tails, you lose $100.
Heads, you win $150.
Would you accept this gamble?

The gamble has a positive expected outcome, since you stand to win more than you stand to lose. But most people feel the fear of losing $100 more strongly than the prospect of gaining $150.
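To see why the gamble is favorable on average, here is the expected-value arithmetic as a small sketch (the 50/50 odds come from the fair coin; the snippet is only an illustration):

```python
# A fair coin: 50% chance to win $150, 50% chance to lose $100.
expected_value = 0.5 * 150 + 0.5 * (-100)
print(expected_value)  # 25.0 -> on average you gain $25 per toss, yet most people decline
```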

Many similar observations have led to the conclusion that people fear losses more than they value gains. In other words, we have a loss aversion.

Imagine another example where you’d win $200 on tails and lose $100 on heads.

Most people prefer to gamble in this scenario. The reason is the “loss aversion ratio,” which defines the balance of potential win and loss you are willing to take. If a potential gain of $200 against a potential loss of $100 is just enough to make you accept the gamble, your loss aversion ratio is 2. For most people, it lies between 1.5 and 2.5. Loss aversion tends to increase when the stakes rise.
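As a minimal sketch of how the loss aversion ratio acts as a decision threshold (the function name and the default ratio of 2.0 are illustrative assumptions, not the book’s):

```python
# A gamble is accepted only if the potential gain is at least
# loss_aversion_ratio times the potential loss (typical ratios: 1.5 to 2.5).
def accepts_gamble(gain: float, loss: float, loss_aversion_ratio: float = 2.0) -> bool:
    return gain >= loss_aversion_ratio * loss

print(accepts_gamble(gain=150, loss=100))  # False -> win $150 / lose $100 is declined
print(accepts_gamble(gain=200, loss=100))  # True  -> win $200 / lose $100 is accepted
```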