Thinking, Fast and Slow

Updated: Sep 13, 2020


Daniel Kahneman writes about how the mind works and provides suggestions for improving how we think. Dr. Kahneman is an Israeli-American psychologist best known for his work on Prospect Theory, for which he was awarded the 2002 Nobel Memorial Prize in Economic Sciences. He is widely regarded as one of the world's leading experts on the psychology of judgment and decision-making. In this book, published on October 25, 2011, Kahneman provides insights into how humans make decisions and the behavioral biases that lead to poor decision-making.


Here are our takeaways organized by the same five parts used in the book. This is more than a summary. We highlight concepts and suggestions that we believe to be the most helpful.

  1. The Two Systems

  2. Heuristics and Biases

  3. Overconfidence

  4. Choices

  5. Two Selves


Part 1: The Two Systems


People use two cognitive systems for thinking. System 1 is our subconscious intuition: it automatically absorbs information and makes fast judgments based on familiar patterns. System 2 is our conscious thinking process, which, given some effort, can help us review suggestions from System 1 and methodically evaluate problems and new situations. People tend to believe that System 2 is more important, but we actually spend most of our time in System 1, which drives how we see ourselves in the world and helps us to be more creative.


“The main function of System 1 is to maintain and update a model of your personal world, which represents what is normal in it.”


System 1 also holds the potential for developing expertise in handling repeated situations. Many complex tasks that we do every day, such as driving, require little effort. Experts can be described as those who have performed a particular task so many times that their intuitions offer correct suggestions at a far higher rate than those of non-experts, whether the task is diagnosing a patient, reading the mood of a potential client during a negotiation, or any other task that allows us to learn from rapid feedback.


Using System 2 comes with biological and behavioral costs. Scientists have measured these costs: the biological cost shows up as a drop in glucose levels, and the behavioral cost is an increase in negative actions. For example, those who have been expending more energy in System 2 are more likely to stereotype, give in to temptations, make selfish choices, use sexist language, and make superficial judgments. Perhaps because of these costs, humans have evolved to take mental shortcuts called heuristics to avoid mental effort, leaving System 1 as our default mental state. System 2 monitors suggestions from System 1, but it is lazy, often relying on quick intuition rather than deep and thorough investigation.


Persuasion techniques appeal to System 1 because of its dominant role. That is why headlines and commercials are simple and memorable, employing rhymes and repetition. People are more likely to be persuaded when they observe information that is consistent with their own world view.


“A compelling narrative fosters an illusion of inevitability.”


Our world view is extrapolated from tiny bits of information that are often biased by the uniqueness of our life experience. System 1 further compounds this bias by anchoring to it and filtering out future information that is inconsistent with the past. Narratives and simple explanations help us remember our world view, but they simultaneously oversimplify reality. As a result, our world view is far simpler and more deterministic than reality.


"Simple explanations play into our desire for control and high self-esteem."


System 2 can theoretically serve as an antidote to these biases, but instead it often magnifies them by finding reasons to support one's world view through Confirmation Bias. Simple explanations play into our desire for control and high self-esteem. Accepting complexity and "not knowing" is challenging and requires a great deal of courage. Openly not knowing carries the risk that others may see us as simple or lacking in conviction, and changing our views publicly can undermine our credibility. These risks are hard-wired into our psyche.


“Facts that challenge...basic assumptions – and thereby threaten people’s livelihood and self-esteem – are simply not absorbed.”


System 1 absorbs information with little thought to its quality and turns it into impressions, intuitions, and beliefs. These in turn drive our actions. We habituate to what is normal, screening out everyday information like the faces we pass on our commute to work. We pay more attention to what is new, especially if it makes us scared or angry, due to the role these emotions played in keeping us alive over millions of years. As a result, unimportant and unrepresentative information like a child abduction in another country can lead to extreme actions like never letting a child attend a sleepover or post a family picture online.

We can learn to better coordinate our two cognitive systems by understanding how they work. To start, we explore common heuristics and biases.


Part 2: Heuristics and Biases


Law of Small Numbers - Assuming that the small amount of data you have resembles the total population. Gathering more information is cognitively expensive, so System 2 will often lazily extrapolate from limited experiences when making decisions. That's why first impressions and experiences are so important.


Anchoring Bias - Allowing an initial value to have an outsized influence on your prediction of future values or your estimate of a quantity. For example, if you start watching a stock at a price of $50, you are likely to view prices above and below $50 as arbitrarily "expensive" and "cheap".


Availability Bias - Allowing your beliefs about the probability of an event to be influenced by the ease with which you can think of examples. People overestimate the probability of extremely unlikely events like kidnappings, plane crashes, and shark attacks because these events are horrifying and therefore easy to remember from the news and visualize.


“We are confident when the story we tell ourselves comes easily to mind, with no contradiction and no competing scenario. But ease and coherence do not guarantee that a belief held with confidence is true.”


Map is not the territory - Allowing our narratives to mask life's complexity. We all create mental models for how the world works and how to make decisions, but we often conveniently forget that all models are wrong because we like to feel all-knowing and in control.


Tom W’s Specialty - Allowing one's own priors to override base-rate probabilities. For example, stocks in the United States have tended to outperform Treasury bonds by an average of 6% a year (the base rate), but an investor may speculatively sell stocks for bonds in the short term because they believe they know something others don't (the prior).


Less is more - Allowing persuasive details of a scenario to increase our perceived likelihood of an event. For example, you may believe the probability of higher taxes after the next election to be 20%, but if asked about the likelihood of a specific candidate raising taxes after hearing specific details of their platform, you might state an even higher probability, only because it's easier to visualize the scenario, even though a specific scenario can never be more likely than the broader event that contains it.


Examples vs Facts - Allowing examples to have more influence on our beliefs than facts and statistics. Kahneman points out that this is why politicians tend to emphasize examples instead of statistics.


Regression to the Mean - Within populations, unusually high or low performance tends not to persist. For example, a basketball player who recently made several shots in a row is no more likely than their base rate to make their next shot.


Base rates with adjustments - Kahneman suggests that we can typically improve our predictions by starting with our base rates and applying adjustments supported by empirical evidence. This process can be further improved by recognizing sources of overconfidence and using tools to avoid it.
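As a rough sketch of what such an adjustment could look like (the linear blend, the function name adjusted_prediction, and the example numbers are our own illustration, not a formula taken from the book), you can start at the base rate and move toward your intuitive estimate only in proportion to how predictive the evidence actually is:

# Illustrative sketch: blend an intuitive estimate with the base rate,
# weighting the intuition by how much the evidence actually predicts the outcome.
def adjusted_prediction(base_rate, intuitive_estimate, evidence_quality):
    # evidence_quality: 0.0 means the evidence tells you nothing (stay at the base rate);
    # 1.0 means the evidence fully justifies the intuitive estimate.
    return base_rate + evidence_quality * (intuitive_estimate - base_rate)

# Example: stocks have beaten bonds by about 6% a year on average (base rate),
# intuition says 12% next year, but the supporting evidence is weak (weight 0.25).
print(adjusted_prediction(6.0, 12.0, 0.25))  # prints 7.5

The specific numbers matter less than the discipline: the weaker the evidence, the closer the prediction should stay to the base rate.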


Part 3: Overconfidence


“Most of us view the world as more benign than it really is, our own attributes as more favorable than they truly are, and the goals we adopt as more achievable than they are likely to be.”


The Illusion of Understanding - Allowing ourselves to believe we understand the reasons for historical events. We do this by constructing narratives that fit some of the historical data while ignoring data that is inconsistent with the narrative. This leads us to believe that we can accurately predict the future.


“The idea that the future is unpredictable is undermined every day by the ease with which the past is explained.”


The Illusion of Validity - Allowing a feeling of confidence in a judgment, based on its coherence, to determine whether the judgment is valid, rather than relying on an objective measure of accuracy such as unbiased data sampling and analysis.


Intuitions vs. Formulas - Kahneman recommends using simple formulas and other structured reasoning over intuition when feasible.


Expert Intuition: When Can We Trust It? - Many years of practiced experience can lead to improved intuition, but only under normal circumstances consistent with that experience. In other words, don't trust yourself or experts when operating outside a field of expertise.


“Organizations that take the word of overconfident experts can expect costly consequences.”


Think in probabilities - Always plan using a probabilistic view of the world. People tend to rely too heavily on a single base case. Instead, make decisions that lead to better outcomes across many possible scenarios.


In general, overconfidence is driven by a suppression of doubt. In addition to the above strategies, it can be helpful to have a "premortem" discussion before making big decisions in order to raise and legitimize doubts.


Part 4: Choices


In Part 4 of the book, Kahneman lays out several concepts that are useful for making choices and avoiding errors.


“The way to block errors that originate in System 1 is simple in principle: recognize the signs that you are in a cognitive minefield, slow down and ask for reinforcement from System 2.”