Lessons from: Thinking, Fast and Slow

Name: Thinking, Fast and Slow
Author(s): Kahneman, Daniel

Synopsis

This is, of course, a well-known book, so you might already know it's about how humans make decisions and how we can identify certain systems at play that can impact decision making (and correct for them if required). This book is a treasure trove of knowledge on psychology and biases, and my cherry-picked ideas below do not do it justice; if you are interested in the subject, do read the book for yourself. While the book talks through many concepts and cites relevant examples, I found that most of them can be categorized under:
  1. Framing: How a situation is presented (to us or to others) can lead to very different outcomes in how it is interpreted by and subsequently acted upon by people.
  2. Availability: We are lazy and, unless checked, will make decisions based on information that is readily available, erroneously assuming it to be all there is, not seeking additional information, and sometimes ignoring competing information entirely.
  3. Loss aversion: We will often make irrational decisions so that we can just avoid the uncomfortable feeling of having lost something. We humans, we really like to hang on to what we think we already have.

Core ideas

  1. We have two systems, 1 and 2: System 1 (S1) operates automatically, with little conscious effort. System 2 (S2) is effortful and requires conscious allocation of attention (and is interrupted when attention is interrupted). These two systems work together, and S1 often hands off more complex/novel experiences to S2 to "look at".
    1. S1 runs automatically (and cannot be turned "off") and S2 is usually in a "low effort" mode (and can be turned "off"). S1 suggests things to S2 and if it all goes smoothly then S2 accepts S1's suggestions with no/little modification.
    2. When S1 encounters something it does not have a ready answer to, it calls on S2 (e.g. "2+2=?" will be handled by S1, "76234x563=?" will be handled by S2)
    3. S2 requires attention, and we have a limited store of it: hence we fail at, or find it extremely difficult, to focus on more than one demanding activity at once. The invisible gorilla is an amusing example.
    4. S1 takes over in emergencies and assigns total priority to self-protective actions. S2 is in charge of conscious behaviour and one of its jobs is to overcome S1's impulses.
    5. S1 likely developed first and, with the right "heuristics", allowed certain species to survive and pass on those traits. S2 probably came second because it provides the evolutionary advantage of conscious planning and forecasting. But S1 still stuck around because life likes to be economical about energy usage and S2 is energy hungry, so where the stakes are routine or low, S1 provides an energy-efficient (and much faster) alternative to using S2.
    6. S1 likes to be digital (i.e. binary) about things, while S2 can be analogue (i.e. have gradients).
  2. Be on guard when you know your S2 is "depleted": S1 has more influence over behaviour when S2 is tired, busy elsewhere or overwhelmed, e.g. indulging in beer after a hard day at work, having chocolate cake for breakfast after a sleepless night, or resorting to "standard procedure" instead of thinking through the problem. If you know your S2 is compromised, avoid tempting/risky situations. Exertion of self-control depletes S2; getting into a flow state or meditating might help restore it.
    1. S2 itself might be divided into two "parts": one for "intelligence" and the other for "rationality". It's often the latter that gets compromised more.
  3. Cognitive ease and strain: When you are in a state of cognitive ease, you are probably in a good mood, like what you see, believe what you hear, trust your intuitions and feel that your current situation is comfortably familiar. You are also likely to be relatively superficial and casual in your thinking. On the contrary, when you are in a state of cognitive strain you are more likely to be vigilant and suspicious, invest more in what you are doing, feel less comfortable and generally make fewer mistakes, but you are also less creative/intuitive.
    1. So, if you are able to do things that induce cognitive ease in people, you are more likely to get them to act in your favour (the book does cover a few ways to improve cognitive ease, such as getting people to smile).
    2. Also, be vigilant when you know you are experiencing cognitive ease or strain due to external factors, such as a phrase repeated often so that it seems true (ease) or small font that makes a business plan hard to read (strain).
  4. We are pattern seekers: We are always looking for causal explanations behind experiences/situations, even where none exist, to the point of assigning agency to inanimate objects. S1 is highly adept at automatically and effortlessly identifying causal connections between events, sometimes even when the connection is spurious.
  5. Mental shotgun: S1 carries out many computations at the same time; however, it often computes much more than is needed, and this can sometimes even be detrimental to the main goal. Kahneman calls this the "mental shotgun". E.g., having a favourable assessment of a company's financial soundness because you like their product.
  6. Substitution: If you are not careful, then if a satisfactory answer to a hard question is not found quickly enough, S1 will replace that question with an easier one, answer it instead and consider the job done. Amazing. E.g., when interviewing a candidate a person can replace the question "Are they well suited for the role?" with "Do they interview well?" and hire them if the answer is yes.
    1. The "affect heuristic" is a good case in point, where S1 short-circuits our ability to think rationally by substituting the much harder question of "What do I think about it?" with "How do I feel about it?" and answering that instead. Indeed, the author provides examples and experiments where "people make judgements and decisions by consulting their emotions: Do I like it? Do I hate it? … often without knowing they are doing so".
    2. "Asked to reconstruct their former beliefs, people retrieve their current ones instead ... and many cannot even believe that they ever felt differently."
  7. Priming can be used to solicit desired behaviour out of people: It can happen without people even being consciously aware that they are being primed (that is to say, priming phenomena arise in S1, and we have no conscious access to them). Ideas can influence action and the reverse is also true. For example, being amused tends to make people smile, and smiling tends to make people feel amused.
    1. This also means that you can prime yourself: by exposing yourself to content and experiences that are in alignment with your life's purpose, you can subconsciously nudge yourself to take more conscious action toward it. Positive action begets positive action, and we do not act the way we think; rather, we think the way we act.
  8. Anchors and priming: You should assume that any number on the table has an anchoring effect on you, and if the stakes are high then you should mobilize S2. Anchors can also be used to your advantage. E.g., an employee who anchors a raise negotiation on a higher (but plausible) number is likely to get a higher raise than one who does not.
  9. Availability heuristic: Is when we judge the frequency of something happening by the ease with which (and not the number of) instances come to mind.
    1. So, if you want people to remember your work(s) as more numerous/consistent, make it easier for them to remember through the use of taglines, slogans and other tools.
  10. Do not ignore the base rate: Base rate in this context refers to the overall probability of something being true. The idea is that we end up assigning higher probability to something when we are introduced to representative examples. E.g., you hear about a person soon to join a Fortune 500 company; they are known to be very intelligent, deep-thinking and hardworking. What is more likely: that they are a PhD holder, or that they are a woman? If you chose the first option, you ignored the base rate and were led astray by representativeness (the probability of anyone being a woman is ~50%, while the probability of anyone holding a PhD is much lower).
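To make the base-rate arithmetic concrete, here is a tiny Python sketch. All the numbers are my own assumptions for illustration, not figures from the book:

```python
# Base rates (assumed): about half of people are women, only a few
# percent hold a PhD.
p_woman = 0.50
p_phd = 0.02

# Assumed likelihoods: the "intelligent, hardworking" description fits
# 90% of PhD holders but only 20% of people in general.
fit_given_phd = 0.90
fit_given_anyone = 0.20

# Bayes' rule numerators: prior x likelihood for each hypothesis.
weight_phd = p_phd * fit_given_phd          # 0.018
weight_woman = p_woman * fit_given_anyone   # 0.10

print(weight_woman > weight_phd)  # True: the base rate dominates
```

Even though the description is far more representative of a PhD holder, the tiny prior for "holds a PhD" swamps the difference in fit.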
  11. Conjunction fallacy: When people judge a conjunction of two events as more likely than one of the events alone in a direct comparison. This is a trap because adding more detail to an argument makes it more persuasive but at the same time less probable; hence, making decisions based on the persuasiveness of arguments is wrong.
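The fallacy is easy to see numerically: for any two events, the probability of both happening can never exceed the probability of either one alone. A minimal sketch with assumed probabilities:

```python
# Assumed probabilities for two events about the same person:
p_a = 0.05   # e.g. "they end up in a particular narrow profession"
p_b = 0.30   # e.g. "they are active in a social movement"

# Even in the most generous case (independence), the conjunction
# is smaller than either event on its own:
p_both = p_a * p_b

assert p_both <= p_a and p_both <= p_b
print(round(p_both, 3))  # 0.015
```

The detailed, two-part description feels more plausible, yet it is mathematically guaranteed to be less probable.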
  12. If you want people to learn, tell it in the form of a story: Statistical results with a causal interpretation have a stronger effect on our thinking than non-causal information. But nothing beats relatable stories where causality is established through personal experience.
  13. Things regress back to the mean: The more impressive an original performance, the more likely it is that the subsequent performance will be worse. This is true for employees and their contribution to work, companies and their annual results, people and their days. However, our minds (S1 specifically) want to establish clean, near-perfect causal relationships between things, so we are surprised when things regress back to the mean. There is more luck involved in smaller samples, so in these cases you should regress more sharply towards the mean. The way to test true causality is through controlled test-group/control-group (TG/CG) experiments.
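Regression to the mean falls out of any model where results mix stable skill with random luck. This simulation is my own toy assumption, not from the book:

```python
import random

random.seed(42)

# Assumed toy model: observed performance = stable skill + fresh luck,
# both drawn from a standard normal distribution.
skills = [random.gauss(0, 1) for _ in range(10_000)]
round1 = [s + random.gauss(0, 1) for s in skills]
round2 = [s + random.gauss(0, 1) for s in skills]

# Take the top 10% performers of round 1 and compare their two rounds.
top = sorted(range(10_000), key=lambda i: round1[i], reverse=True)[:1_000]
avg1 = sum(round1[i] for i in top) / 1_000
avg2 = sum(round2[i] for i in top) / 1_000

print(avg2 < avg1)  # True: yesterday's stars look worse today, on average
```

The stars of round 1 were selected partly for lucky draws; their luck resets in round 2, so their average falls even though nobody's skill changed.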
  14. Halo effect + WYSIATI = Bad: Halo effect is when we judge all qualities of a person based on an attribute that is particularly significant. E.g., when we think a handsome person is also likely to be intelligent. WYSIATI (short for "what you see is all there is") is an S1 shortcoming that compels us into treating the limited information we have as if it were all there is to know. You build the best possible story from the information available to you, and if it's a good story, you believe it. Paradoxically, the less information you have, the easier it is to build good stories.
  15. Confidence = Coherence of information + Cognitive ease of processing it: Declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind and not necessarily that the story is true.
  16. Illusion of skill: We attribute far greater causal relationship between skill and success than luck and success, because this allows S1 to make a clean map of the world. It allows us to reduce the anxiety we would experience if we allowed ourselves to fully acknowledge how uncertain the world really is. And when it gets personal i.e. when people are shown data that indicates that much of their success is due to luck – they simply dismiss the evidence. "Facts that challenge such basic assumptions – and thereby threaten people's livelihood and self-esteem – are simply not absorbed."
  17. Algorithms are generally better than intuitions: If you have a strong, empirically proven algorithm at your disposal, use that to make decisions instead of relying on expert judgement.
  18. The planning fallacy begins suboptimal projects, the sunk cost fallacy ensures they continue: The planning fallacy is when people make forecasts that are unrealistically close to best-case scenarios and that could be improved by consulting the statistics of similar cases; the sunk cost fallacy then keeps such projects alive because we hate to "waste" what we have already invested. The treatment for the planning fallacy is "reference class forecasting", which basically means that while forecasting our own ventures we should take information from other similar ventures, a.k.a. an outside view.
  19. Optimism is highly valued: Socially and in the market; people and firms reward the providers of dangerously misleading information more than they reward truth tellers.
  20. Possibility effect and certainty effect: Possibility effect is when we overweigh small risks or overweigh small opportunities, and are willing to pay far more than expected value to eliminate them altogether or have a chance to get them respectively. Certainty effect is the opposite where we are willing to pay far more than expected value to make something a certainty. Possibility effect is used to sell lotteries and gambles, certainty effect is used to sell insurance policies.
  21. A rich and vivid representation of the outcome, reduces the role of probability in its evaluation: This means you can add graphic descriptions to low probability events to make them seem more probable. Maybe this is what modern day news channels do after all. Another related concept is "denominator neglect", which means that people tend to ignore denominators when presented with absolute, discrete statistics - so that a headline like "1 out of 100,000 chance of a plane crash" will impact readers more than "0.001% chance of a plane crash".
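The two headlines in the denominator-neglect example describe exactly the same risk; a one-liner makes the equivalence obvious:

```python
# "1 out of 100,000" and "0.001%" are the same number; the vivid
# numerator ("1 crash!") is what makes the first framing feel riskier.
risk = 1 / 100_000
print(f"{risk:.3%}")  # 0.001%
```

Whenever a statistic arrives as "X out of Y", converting it to a percentage (or vice versa) is a cheap way to engage S2 before reacting.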
  22. Disposition effect is when people prefer selling winning stocks (stocks on which they have made money) and holding on to losing stocks (stocks on which they have lost money), whereas they would be better off first checking which stock is likely to rise in the future and which is stalled. As an aside, selling the loser gives a tax benefit, while selling the winner means paying tax on the gains. Yet we do not seem to realize this; it's a prime example of S1 hijacking our mental faculties as it substitutes the question "Which stock is likely to do better in the future?" with "How will selling the loser make me feel?" and answers the latter with "I will feel like a loser, so I should sell the winner instead".
  23. Whenever we deviate from the norm, the default, the status quo - we are likely to feel more regret if something goes wrong than if the same thing goes wrong while we maintained status quo. This makes people avoid taking bets on unproven but promising ventures.
  24. Losses evoke stronger negative feelings than costs: This is why framing is so important, as the same result is more likely to be rejected when presented as a loss than when framed as a cost. Say you work at a SaaS firm where you are finalizing your pricing ladder. You know it's better to get people to sign up for longer-term plans (quarterly and yearly) versus short-term plans (weekly or monthly), so you want to make the annual plan 10% cheaper than the monthly plan on a "per-day cost" basis. On the pricing page, what would make more people choose the yearly plan: adding "10% surcharge for monthly subscriptions" underneath the monthly subscription price, or "10% discount on yearly subscriptions" beneath the yearly plan price? The former, because the surcharge frames the monthly plan as a loss.
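The pricing arithmetic behind that example, with a hypothetical $10/month plan (the dollar figure is my own assumption):

```python
# Hypothetical plan: $10/month, and a yearly plan priced so that its
# per-day cost is 10% below the monthly plan's.
monthly_price = 10.00
monthly_per_day = monthly_price * 12 / 365   # ~$0.329 per day
yearly_per_day = monthly_per_day * 0.90      # 10% cheaper per day
yearly_price = yearly_per_day * 365          # equivalently, 120 * 0.90

print(round(yearly_price, 2))  # 108.0
```

Whether the page says "monthly costs 10% more" or "yearly costs 10% less", the numbers are identical; only the loss-versus-cost framing changes.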
  25. The peak-end rule: when people are asked to rate an experience after the fact, they end up rating it based on the highest intensity the experience reached and the intensity at which it ended. So when two people undergo painful medical procedures, where one procedure lasts 10 minutes with the patient experiencing intense pain all the way to the end, and the other lasts 30 minutes with relatively lower peak pain that gradually subsides towards the end, the first person is likely to rate the procedure as much more painful than the second, even though the second person experienced pain three times as long. Clearly, the "experiencing self" and the "remembering self" are two different people.
    1. Duration neglect is a corollary to the peak-end rule: people tend to ignore the duration of an experience while evaluating it and focus only on the peak moments and the moments at the end.
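The medical-procedure example above can be sketched numerically. The pain traces and the "average of peak and end" formula are my own simplifying assumptions to illustrate the rule:

```python
# Two hypothetical procedures, pain sampled once per minute on a 0-10 scale.
proc_a = [8] * 10                                      # 10 min, ends at peak
proc_b = [8] * 10 + [7, 6, 5, 4, 3, 2, 1] + [1] * 13   # 30 min, tapers off

def remembered(pain):
    # Assumed simple form of the peak-end rule: remembered intensity is
    # the average of the peak moment and the final moment; duration is
    # ignored entirely (duration neglect).
    return (max(pain) + pain[-1]) / 2

print(sum(proc_a), sum(proc_b))                # 80 121 (total pain endured)
print(remembered(proc_a), remembered(proc_b))  # 8.0 4.5 (remembered pain)
```

Procedure A inflicts less total pain (80 vs 121) yet is remembered as far worse (8.0 vs 4.5), because the remembering self only keeps the peak and the ending.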
  26. Focusing illusion: Nothing in life is as important as you think it is when you are thinking about it. Which means any aspect of life to which attention is directed will loom large in a global evaluation.

Notable quotes

  • Expert intuition strikes us as magical, but it is not.
  • A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth. <AND> The familiarity in one phrase of the statement sufficed to make the whole statement feel familiar, and therefore true.
  • Aphorisms were judged more insightful when they rhymed than when they did not.
  • A good mood loosens the control of System 2 over performance: when in a good mood, people become more intuitive and more creative but also less vigilant and more prone to logical errors.
  • The main function of System 1 is to maintain and update a model of your personal world, which represents what is normal in it.
  • Your mind is ready and even eager to identify agents, assign them personality traits and specific intentions, and view their actions as expressing individual propensities.
  • System 1 is gullible and biased to believe, System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy ... there is evidence that people are more likely to be influenced by empty persuasive messages, such as commercials, when they are tired and depleted.
  • <VERY RELEVANT TO MEETINGS> The standard practice of open discussion gives too much weight to the opinion of those who speak early and assertively, causing others to line up behind them.
  • It is the consistency of information that matters for a good story, not its completeness. Indeed, you will often find that knowing little makes it easier to fit everything you know into a coherent pattern.
  • The world in our heads is not a precise replica of reality; our expectations about the frequency of events are distorted by the prevalence and emotional intensity of the messages to which we are exposed.
  • The affect heuristic simplifies our lives by creating a world that is much tidier than reality. Good technologies have few costs in the imaginary world we inhabit, bad technologies have no benefits, and all decisions are easy.
  • Human beings have invented the concept of “risk” to help them understand and cope with the dangers and uncertainties of life. Although these dangers are real, there is no such thing as “real risk” or “objective risk.”
  • An availability cascade is a self-sustaining chain of events, which may start from media reports of a relatively minor event and lead up to public panic and large-scale government action.
  • Democracy is inevitably messy, in part because the availability and affect heuristics that guide citizens’ beliefs and attitudes are inevitably biased, even if they generally point in the right direction.
  • Although it is common, prediction by representativeness is not statistically optimal.
  • Unless you decide immediately to reject evidence (for example, by determining that you received it from a liar), your System 1 will automatically process the information available as if it were true.
  • The experiment shows that individuals feel relieved of responsibility when they know that others have heard the same request for help.
  • The test of learning psychology is whether your understanding of situations you encounter has changed, not whether you learned a new fact.
  • Rewards for improved performance work better than punishments for mistakes.
  • Extreme predictions and a willingness to predict rare events from weak evidence are both manifestations of System 1.
  • … we humans constantly fool ourselves by constructing flimsy accounts of the past and believing they are true.
  • Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.
  • A few lucky gambles can crown a reckless leader with a halo of prescience and boldness.
  • Most buyers and sellers <of stocks> know that they have the same information; they exchange the stocks primarily because they have different opinions.
  • Simple, statistical rules are superior to intuitive "clinical" judgements.
  • Intuition cannot be trusted in the absence of stable regularities in the environment.
  • System 1 is often able to produce quick answers to difficult questions by substitution, creating coherence where there is none.
  • Facing a choice, we gave up rationality rather than give up the enterprise.
  • “Pallid” statistical information is routinely discarded when it is incompatible with one’s personal impressions of a case. In the competition with the inside view, the outside view doesn’t stand a chance.
  • When forecasting the outcomes of risky projects, executives too easily fall victim to the planning fallacy. In its grip, they make decisions based on delusional optimism rather than on a rational weighting of gains, losses, and probabilities. They overestimate benefits and underestimate costs. They spin scenarios of success while overlooking the potential for mistakes and miscalculations. As a result, they pursue initiatives that are unlikely to come in on budget or on time or to deliver the expected returns—or even to be completed.
  • An unbiased appreciation of uncertainty is a cornerstone of rationality - but it is not what people and organisations want.
  • The brains of humans and other animals contain a mechanism that is designed to give priority to bad news.
  • The self is more motivated to avoid bad self-definitions than to pursue good ones.
  • Highly unlikely events are either ignored or overweighted.
  • The probability of a rare event is most likely to be overestimated when the alternative is not fully specified.
  • People tend to be risk averse in the domain of gains and risk seeking in the domain of losses.
  • We have neither the inclination nor the mental resources to enforce consistency on our preferences, and our preferences are not magically set to be coherent, as they are in the rational-agent model.
  • A commitment not to change one’s position for several periods (the equivalent of “locking in” an investment) improves financial performance.
  • Except for the very poor, for whom income coincides with survival, the main motivators of money-seeking are not necessarily economic. For the billionaire looking for the extra billion, and indeed for the participant in an experimental economics project looking for the extra dollar, money is a proxy for points on a scale of self-regard and achievement.
  • It is the departure from the default that produces regret.
  • As we have seen again and again, an important choice is controlled by an utterly inconsequential feature of the situation. This is embarrassing—it is not how we would wish to make important decisions. Furthermore, it is not how we experience the workings of our mind, but the evidence for these cognitive illusions is undeniable.
  • The mind is good with stories, but it does not appear to be well designed for the processing of time.

In closing

A great book to read multiple times over until you naturally start applying its wisdom in your life. Especially useful to those in any field of work that impacts or at least involves people.
