Book Notes: “Thinking in Bets” by Annie Duke

Summary

“Thinking in Bets” by Annie Duke (2018) is a book about embracing a probabilistic mindset to improve decision-making. Duke applies the definition of a bet broadly: a bet is any decision, based on our beliefs, that results in the rejection of other possible choices or courses of action. Under this definition, every decision is a bet. Selecting a career is a bet, buying a house is a bet, and deciding what to order at a fancy restaurant is a bet. In most cases, Duke argues, we are not betting against other people; we are “betting against all the future versions of ourselves that we are not choosing. We are constantly deciding among alternative futures.” Since our bets are based on our beliefs, the quality of these bets is contingent on the quality of our beliefs. Duke spends much of the book exploring how to improve the quality of those beliefs in order to make better decisions. Better decisions cannot guarantee better outcomes, but they do increase their likelihood.

The author enjoyed a successful career as a professional poker player for nearly two decades (she retired from competition in 2012). Unsurprisingly, poker figures prominently in Thinking in Bets. According to Duke, “Poker is a great place to find practical strategies to get the execution of our decisions to align better with our goals. Understanding how poker players think can help us deal with the decision challenges that bedevil us.” The book never delves too deeply into the ins and outs of the game of poker; rather, Duke tries to instill the broadly applicable tactics and mental strategies used by successful poker players into the mind of the reader. The goal isn’t to teach readers how to play poker; it is to carry the transferable skills of poker into everyday life.

A powerful lesson in this poker mindset appears early in the book, when Duke discusses the error of conflating the quality of a decision with the quality of an outcome. “Drawing an overly tight relationship between results and decision quality affects our decisions every day, potentially with far-reaching, catastrophic consequences.” Poker players call this thinking error “resulting.” The problem with resulting is that it fails to distinguish between two key ingredients in any outcome: our skill (the things we can control) and luck (the things we cannot control). Resulting can lead to inaccurate assessments of why we were successful. For instance, a stock market investor might attribute financial success to his or her business acumen rather than external market forces and randomness. Alternatively, pundits and Monday-morning quarterbacks might question an objectively defensible decision to call a goal-line pass in the final minutes of the Super Bowl when that decision results in an unlikely and disastrously unlucky outcome (as happened to the Seahawks in Super Bowl XLIX). We overweight skill and underweight luck in many situations. If this idea sounds familiar to readers of this blog, it’s because it’s also explored in Fooled by Randomness by Nassim Taleb, another excellent book on understanding luck, probability, and uncertainty.

Thinking in Bets is a worthwhile read for those uninitiated in probabilistic thinking and rational decision-making. Duke excels at making her ideas accessible and relevant to readers from all walks of life. The work covers similar ground to Taleb’s but is far breezier in tone. Whereas Taleb is more philosophical and prone to pull in examples from ancient history, science, and the investment world, Duke’s prose is easygoing, inclined toward pop culture references, and grounded in the world of poker. It might not be the best book of its kind, but it’s an enjoyable read full of insight; anyone interested in the topic should come away with something interesting to think about.

Pros: Duke’s book is highly accessible (certainly more so than Taleb’s). The first three chapters are especially insightful and info-dense.

Cons: Chapters 4 and 5, while useful, could have been condensed. The second half of the book is not as strong as the first half.

Rating: 8/10


Introduction

  • Thinking in bets helped the author move towards objectivity, accuracy, and open-mindedness.

  • The habit compounds over time and can yield big benefits.

  • Two key determinants in our outcomes: decision quality and luck.

    • Thinking in bets is about understanding the difference between the two.

Chapter 1: Life Is Poker, Not Chess

  • Story: Super Bowl XLIX (2015). Seattle coach Pete Carroll’s controversial decision to call a passing play in a short-yardage situation at the end of the game (instead of running the ball). The pass was intercepted and Seattle lost. The public and news media railed against Carroll’s decision, calling it the most bone-headed play call ever.

    • From a clock-management standpoint, the call was defensible (allowed the opportunity for an additional play).
    • From a risk standpoint, the call was defensible: the interception rate from that part of the field over 15 seasons was under 2%.
    • Author: “Carroll got unlucky. He had control over the quality of the play-call decision, but not over how it turned out…he made a good-quality decision that got a bad result.”
  • Resulting: Poker term for conflating the quality of a decision with the quality of its outcome.

    • Author warns against changing one’s strategy just because the results are poor in the short run.
    • This habit is a result of failing to separate luck from skill.
  • “Drawing an overly tight relationship between results and decision quality affects our decisions every day, potentially with far-reaching, catastrophic consequences.”

  • Hindsight bias: The tendency to view an outcome as inevitable after the outcome is known.

    • It is a failure to think probabilistically.
  • “Our brains evolved to create certainty and order. We are uncomfortable with the idea that luck plays a significant role in our lives.”

  • “When we work backward from results to figure out why those things happened, we are susceptible to a variety of cognitive traps, like assuming causation when there is only a correlation, or cherry-picking data to confirm the narrative we prefer.”

    • Reflexive mind: fast, automatic, unconscious thinking (aka System 1, per Kahneman).
    • Deliberative mind: slow, deliberate, judicious thinking (aka System 2, per Kahneman).
  • Poker provides a valuable context for Duke’s ideas.

    • Poker players make many decisions under time and financial pressure with a high degree of uncertainty.
    • Good poker players must navigate the deliberative and reflexive mental systems.
  • John von Neumann: Mathematician who made key contributions to game theory.

    • Theory of Games and Economic Behavior (1944)
    • Roger Myerson defines game theory as “the study of mathematical models of conflict and cooperation between intelligent rational decision-makers.”
    • Von Neumann modeled game theory on poker.
  • Poker vs. chess (from a game theory perspective):

    • Chess has little hidden information and little luck (assuming both players are skilled and competent). When a player loses, it’s because they missed a better move. Despite its complexity, chess is not a good model for real-world decision-making because it lacks hidden information and luck.

    • Poker is a game of incomplete information, high uncertainty, and luck. In this, it mirrors real life. It’s possible to get lucky and win or get unlucky and lose.

      • A novice can beat a champion in poker. This will never happen in chess. [Me: Same phenomenon can be found in investing, per Nassim Taleb]
  • The Princess Bride (1987 film) anecdote in which Vizzini operates without the knowledge that his opponent is immune to Iocane powder and both goblets are poisoned. An example of not knowing what we don’t know.

  • Getting comfortable with “I don’t know” is a critical step in becoming a better decision-maker.

    • It offers a more accurate representation of the world.
    • It prevents us from falling into the trap of black-and-white thinking.
  • “When we think in advance about the chances of alternative outcomes and make a decision based on those chances, it doesn’t automatically make us wrong when things don’t work out. It just means that one event in a set of possible futures occurred.”

    • Reframing low probability outcomes as “you got to see part of the X%.”
    • “The public-at-large is often guilty of making black-and-white judgments about the ‘success’ or ‘failure’ of probabilistic thinking.”
    • Unlikely events happen, and even “lower probability” events (e.g., 30%) will happen a lot (see the sketch at the end of this chapter’s notes).
    • An unwanted outcome doesn’t mean our decision was wrong.
  • Remember: This way of thinking works both ways.

    • If a bad outcome doesn’t necessarily make our decision wrong, it follows that a good outcome doesn’t necessarily make our decision right.
    • We must be ready to assess the quality of our decision and the role of luck in both good and bad outcomes.
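
  • [Me: A minimal simulation sketch of the point above: low-probability outcomes still show up regularly, so a single bad result says little about decision quality. The ~2% interception figure and the 30% example come from these notes; the function name and trial count are mine, purely for illustration.]

    import random

    def observed_rate(p_outcome: float, trials: int = 100_000, seed: int = 1) -> float:
        """Repeat the same decision many times and count how often a given outcome occurs."""
        rng = random.Random(seed)
        hits = sum(rng.random() < p_outcome for _ in range(trials))
        return hits / trials

    # A ~2% interception risk still produces an interception roughly 1 time in 50,
    # so someone, somewhere, will get unlucky with a perfectly defensible call.
    print(f"~2% outcome observed: {observed_rate(0.02):.3f}")

    # A "lower probability" 30% outcome happens almost a third of the time.
    print(f"30% outcome observed: {observed_rate(0.30):.3f}")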

Chapter 2: Wanna Bet?

  • When we bet on one choice, we effectively reject the other possible choices.

    • A given situation has many possible paths or futures. A choice represents a single path (though multiple outcomes are possible).
    • Any choice carries an opportunity cost: the value of the alternatives we forgo.
  • “By treating decisions as bets, poker players explicitly recognize that they are deciding on alternative futures, each with benefits and risks. They also recognize there are no simple answers.”

  • Characteristics of a bet:

    • A choice made by thinking about what will probably happen.
    • The risk of losing something when you try to do or achieve something.
    • Making a decision based on a belief that something will happen.
    • “Overlooked aspects of betting: choice, probability, risk, decision, belief.”
    • “Our decisions are always bets.”
  • Duke applies the definition of bet broadly: “Job and relocation decisions are bets. Sales negotiations and contracts are bets. Buying a house is a bet. Ordering the chicken instead of the steak is a bet. Everything is a bet.”

  • “In most of our decisions, we are not betting against another person. Rather, we are betting against all the future versions of ourselves that we are not choosing. We are constantly deciding among alternative futures…that the future version of us that results from the decisions we make will be better off.”

  • Our bets are based on our beliefs. Therefore, the quality of our bets is contingent on the quality of our beliefs.

  • How we should form beliefs:

    • We hear something.
    • We think about it and determine if it’s true or false.
    • We form our belief.
  • How we actually form beliefs:

    • We hear something.
    • We believe it to be true.
    • We (might) think about it and determine if it’s true or false.
  • Per research by Daniel Gilbert (psychologist): Our default is to believe what we read and hear [Me: This reminds me of the David St. Hubbins “selectivity” principle from Spinal Tap.]

    • Example: The belief that baldness is passed down from one’s maternal grandfather. We believe this because it is the conventional wisdom. But the belief is not true.
    • Example: “Suited connectors” in poker are considered strong cards, but the reality is that they are not.
    • Example: The low-fat diet was long believed to be healthy; the guidance contributed to greater obesity and was ultimately debunked.
  • “We form beliefs without vetting most of them, and maintain them even after receiving clear, corrective information.”

  • Truthseeking: The act of seeking the objective truth even if the truth contradicts our current beliefs.

  • Motivated reasoning: The act of seeking out confirmatory evidence and discrediting contradictory evidence.

    • Once we believe something, our tendency is to protect the belief.
    • This is related to confirmation bias.
  • Binary thinking, e.g., seeing beliefs as 100% right or 100% wrong, is problematic. It leaves little wiggle room for changing our beliefs and makes it difficult to confront new, contradictory evidence.

  • “The smarter you are, the better you are at constructing a narrative that supports your beliefs, rationalizing and framing the data to fit your argument or point of view.”

  • Blind-spot bias: People are better at seeing biased or flawed reasoning in others rather than in themselves.

  • “Wanna bet” as a trigger for thinking probabilistically. When we bet, we must account for many factors in our decision:

    • How do I know this?
    • Where did I get the info?
    • What is the quality and credibility of my source?
    • Is my information up to date?
    • Is my information relevant to my belief?
    • What are the alternative beliefs or explanations?
    • What do I know about the person challenging my belief?
    • What do they know that I don’t know?
    • What am I missing?
  • With a betting mindset, assign a confidence level to your belief rather than treating it as absolutely certain or absolutely wrong. Example: “I’m 60% confident that Citizen Kane won best picture.”

    • New information (including opposing info) can be used to update the confidence in your belief (see the sketch at the end of this chapter’s notes).
    • It also invites others to engage as collaborators.
    • The goal is truthseeking rather than affirming preexisting beliefs.
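
  • [Me: Duke doesn’t prescribe a formula, but a simple Bayes-rule sketch is one way to make “updating the confidence in your belief” concrete. The 60% starting point is the Citizen Kane example above; the likelihood numbers and function name are hypothetical, for illustration only.]

    def update_confidence(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
        """Bayes-rule update: revise confidence in a belief after new evidence arrives."""
        numerator = prior * p_evidence_if_true
        return numerator / (numerator + (1 - prior) * p_evidence_if_false)

    # Start at 60% confidence ("I'm 60% confident that Citizen Kane won best picture").
    confidence = 0.60

    # A usually-reliable friend disagrees. Suppose (hypothetically) they would disagree
    # 20% of the time if I were right and 80% of the time if I were wrong.
    confidence = update_confidence(confidence, p_evidence_if_true=0.20, p_evidence_if_false=0.80)
    print(f"Updated confidence: {confidence:.0%}")  # ~27%: the belief weakens but isn't flipped to 0 or 100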

Chapter 3: Bet to Learn: Fielding the Unfolding Future

  • Don’t conflate experience with expertise. Experience might be necessary for expertise, but it’s not a sufficient signal on its own (and plenty of experienced people lack expertise).

  • Aldous Huxley: “Experience is not what happens to a man; it is what a man does with what happens to him.”

  • Learning loop 1: Belief ==> Bet ==> Outcome ==> Next Bet

    • Doesn’t account for luck.
    • Outcomes are feedback.
    • Use the lessons from those outcomes to inform and refine beliefs and future bets.
  • Outcomes are a combination of skill and luck:

    • Skill is when we can predictably generate the same outcome based on our decisions.

    • Luck is when outcomes are the result of things beyond our control. This includes the actions of others, environment, genes, missing information, etc.

    • Example: A car crash.

      • Did we crash our car because we failed to see a traffic light? (skill).
      • Did we crash our car because another driver ran a red light? (luck).
      • [Me: Here skill and luck are different sides of the coin. In the latter, the driver who ran the light lacked skill.]
  • Beware a common human tendency when ascribing skill or luck to an outcome: “Chalk up an outcome to skill, and we take credit for the result. Chalk up an outcome to luck, and it wasn’t in our control.”

  • Learning loop 2: Belief ==> Bet ==> Outcome ==> Luck

    • “Outcomes don’t tell us what’s our fault and what isn’t, what we should take credit for and what we shouldn’t.”

    • Duke reminds us to be more thoughtful when assessing an outcome vis-à-vis our belief.

    • She calls this sorting process “fielding outcomes”: do you categorize the outcome in the “skill bucket” or the “luck bucket”? Fielding outcomes in a self-serving way (crediting skill for good results, blaming luck for bad ones) is self-serving bias.

    • Fielding of outcomes lets us determine which results are due to skill and which are due to luck. We can identify which things are in our control and can be changed in the future to increase the probability of desired outcomes.

      • Remember: Good outcomes do not perfectly correlate with good skill. Bad outcomes do not perfectly correlate with bad luck.
  • “Just as we are almost never 100% wrong or right, outcomes are almost never 100% due to luck or skill.”

  • We cannot learn (effectively) from our experiences if we cannot distinguish between luck and skill.

  • We also need to recognize the dynamic of skill and luck in others’ experiences.

  • “There are people who, like Phil Ivey [poker champion], have substituted the routine of truthseeking for the outcome-oriented instinct to focus on seeking credit and avoiding blame.”

    • Aim for accurate self-critique [Me: reminds me of Jocko Willink’s Extreme Ownership]
  • One way to stand out and be different: “Keep the reward of feeling like we are doing well compared to our peers, but change the features by which we compare ourselves: be a better credit-giver than your peers, more willing than others to admit mistakes, more willing to explore possible reasons for an outcome with an open mind...”

Chapter 4: The Buddy System

  • Truthseeking is beneficial in small groups, but all participants have to agree to the objective of truthseeking.

    • This isn’t a behavior that’s appropriate in all contexts.
    • Sometimes someone just wants to vent. As a listener, you need to know (or ask explicitly) if they want advice or just a friendly, non-judgmental ear.
  • “Being in a group can improve our decision quality by exploring alternatives and recognizing where our thinking might be biased, but a group can also exacerbate our tendency to confirm what we already believe.”

    • Aim for exploratory conversations over confirmatory ones.
    • Confirmatory: A one-sided attempt to rationalize a point of view. Promotes biased thought.
    • Exploratory: An attempt to consider other ideas and points of view. Promotes objective thought.
  • A group agreement is a set of ground rules for promoting exploratory thought. Participants are accountable to peers:

    • Whose views are unknown.
    • Who are interested in accuracy.
    • Who may be well-informed.
    • Who have legitimate reasons for exploring the topic.
    • Who may represent a diversity of thought/ideas.
  • “We don’t win bets by being in love with our own ideas. We win bets by relentlessly striving to calibrate our beliefs and predictions about the future to more accurately represent the world.”

  • “A bet is a form of accountability. If we’re in love with our own opinions, it can cost us in a bet.”

  • “Evidence that might contradict a belief we hold is no longer viewed through as hurtful a frame. Rather, it is viewed as helpful because it can improve our chances of making a better bet.”

  • John Stuart Mill (from On Liberty, 1859): “The only way in which a human being can make some approach to knowing the whole of a subject, is by hearing what can be said about it by persons of every variety of opinion, and studying all modes in which it can be looked at by every character of mind.”

  • Questions for examining our beliefs:

    • Why might my belief not be true?
    • What evidence contradicts my belief?
    • Are there sources I have missed that could inform my belief?
    • Have I overemphasized other sources in my belief formation?
    • What is the basis for a contradictory belief? Why might others be more right than I am?
    • What are other possible explanations for some outcome?
  • “Well-deployed diversity of viewpoints in a group can reduce uncertainty due to incomplete information by filling in the gaps in what we know...”

    • Groupthink and confirmatory drift occur when only a narrow set of viewpoints is considered.
    • Duke highlights research on federal courts showing different judicial outcomes when even a single judge appointed by the other party is present (e.g., two Democratic appointees and one Republican appointee).
  • “In political discourse, virtually everyone, even those familiar with groupthink, will assert, ‘I’m in the rational group exchanging ideas and thinking these things through. The people on the other side, though, are in an echo chamber.’”

  • Heterodox Academy is an academic organization formed to fight against intellectual homogeneity and confirmatory thought.

Chapter 5: Dissent to Win

  • Robert K. Merton’s CUDOS framework, a model for collaborative scientific inquiry:

    • Communism wherein the data belongs to the group (do not confuse this with the political and economic ideology). Data sharing promotes transparency and understanding.
    • Universalism wherein uniform standards are applied to claims, evidence, and data regardless of source or provenance.
    • Disinterestedness is the requirement to remain unbiased and avoid conflicts of interest.
    • Organized Skepticism is the desire for productive engagement and dissent.
  • Rashomon Effect: The idea that the same event can be seen and experienced by multiple individuals whose recollections and conclusions can differ significantly, because different facts and perspectives are available to each person. The phenomenon is named for Akira Kurosawa’s 1950 film Rashomon.

    • “We cannot assume one version of a story is accurate or complete.”
  • “Sharing data is the best way to move toward accuracy because it extracts insight from your listeners of the highest fidelity.”

    • The more detail and information available to evaluate, the more likely a good quality decision can be made.
  • “Don’t disparage or ignore an idea just because you don’t like who or where it came from.”

    • The accuracy of an idea or belief should be evaluated independent of its source.
    • When interacting with people with different belief systems, look for areas of common ground or agreement as well as areas of disagreement.
  • “We are not naturally disinterested. We don’t process information independent of the way we wish the world to be.”

    • Beware the “should” vs “is” trap: this is the classic normative vs. positive view of the world.
    • All aspects of decision-making, belief formation, and even scientific inquiry are prone to bias. For instance, scientific measurements might be accurate, but consider how the data were selected, divided, categorized, and evaluated.
    • Example: If we ask someone to evaluate a decision and also tell them the outcome, knowledge of the outcome will color their opinion of the decision.
  • “Skepticism is about approaching the world by asking why things might not be true rather than why they are true.”

    • Be just as skeptical of confirmatory information or evidence.
    • Productive skepticism asks “Are you sure about that?” and “Have you considered...” rather than “You’re wrong!”
  • Tips for communicating with people beyond your truthseeking group:

    • Start with uncertainty.

    • Lead with assent (find common ground).

      • Use “and” rather than “but.”
      • Supplement rather than negate.
    • Ask for a temporary agreement to engage in truthseeking.

    • Focus on the future.

      • “It’s harder to get defensive about something that hasn’t happened yet.”
      • “Do you think there’s anything you can do to improve [desired outcome] in the future?”

Chapter 6: Adventures in Mental Time Travel

  • The author discusses the idea of shifting perspective among past, present, and future versions of ourselves to arrive at better decisions (ones that weigh not only present concerns but also future ones).

  • “Just as we can recruit other people to be our decision buddies, we can recruit other versions of ourselves to act as our own decision buddies.”

  • “Improving decision quality is about increasing our chances of good outcomes, not guaranteeing them.”

  • Jerry Seinfeld story about “Night Jerry” vs. “Morning Jerry”: “I stay up late at night because I’m Night Guy. Night Guy wants to stay up late. ‘What about getting up after five hours of sleep?’ That’s Morning Guy’s problem...Night Guy always screws Morning Guy.”

  • Temporal discounting is the act of favoring the needs and desires of our present-self at the expense of our future-self.

    • “We are willing to take an irrationally large discount to get a reward now instead of waiting for a bigger reward later.”
    • Example: Saving for retirement is a temporal discounting problem. So is eating the marshmallow in the famous “Marshmallow Test.” Going into debt is another example.
  • “Bringing our future-self into the decision gets us started thinking about the future consequences of those in-the-moment decisions.”

  • 10-10-10: A tool developed by journalist Suzy Welch. It’s a strategy for involving our future self in a decision. Ask yourself, when making an important decision:

    • What are the consequences of this decision in 10 minutes?
    • In 10 months?
    • In 10 years?
  • Plan ahead and anticipate possible outcomes, both negative and positive. This allows us to further refine our decision-making and prepares our future response to a broader range of outcomes.

  • A rational perspective is important for countering our tendency to overemphasize individual moments or outcomes in the present.

    • Example: A flat tire might be annoying in the present, but in the grand scheme of things it’s unlikely to result in lasting unhappiness or problems.
    • Example: Rather than watching minute-by-minute stock price movements (or even daily ones), consider the long-view of annual or even decade movements.
    • Author recommends viewing happiness as a long-term stock holding.
  • Tilt is an emotionally unhinged mental state that impairs decision-making (it often involves blowing something out of proportion and reacting with strong emotion). Learn to recognize and prevent tilt to avoid poor decision-making.

  • Ulysses contract: a way of helping your future self by binding yourself to a predetermined decision path. When sailing past the island of the Sirens, Odysseus has his crew fill their ears with beeswax and has himself tied to the mast (because he wants to hear the Sirens but not succumb to them).

    • These contracts can be barriers or obstacles to certain actions (barrier-inducing).
    • They can also lower barriers that interfere with rational responses (barrier-reducing).
  • More examples and signposts of when we are engaging in poor thinking and fielding errors:

    • Expressing certainty: “I know,” “I’m sure,” “It always happens this way,” “There’s no way that’s true!”
    • Blaming bad luck: “I can’t believe how unlucky I got!” and alternatively, “I’m at the top of my game!”
    • Dismissing the ideas of others based on ad hominem. “What an idiot.” “He’s a gun nut.” “He’s so East Coast.”
    • Signals that we are overemphasizing the present: “This is the worst day ever!”
    • Expressions that signal motivated reasoning: “Conventional wisdom,” “common-sense thinking,” “ask anyone.”
    • The word “wrong.” Per the author, “wrong” is a conclusion, not a rationale.
    • Lack of self-compassion: “I should have known” and “How could I be so stupid?”
    • Words that discourage collaboration. Remember the use of “and” (supplemental) vs. “but” (negating).
  • Scenario planning: Consider the set of possible outcomes for a given scenario and then study the probabilities of those outcomes.

    • Expected value is a forecasting tool that combines probable outcomes along with their expected payouts. It can be used to think probabilistically about opportunities and bets.
    • Expected value example: A $100,000 grant with a 25% probability of winning the award has an expected value of $25,000 (see the sketch at the end of this chapter’s notes). [Me: Naked Statistics by Charles Wheelan, chapter 5, offers a better explanation of expected value.]
  • Backcasting: Act of working backwards from a goal. You start at the end (assuming a positive outcome) and then determine the essential steps to arrive at that goal along with the potential obstacles and issues preventing you from achieving the goal. [Me: This is similar to a process that Amazon famously uses when developing products.]

  • Premortems: These are similar to backcasting, but they assume a negative outcome. Premortems are a complement to backcasting (just via the lens of a different scenario).

  • Hindsight bias: Views outcomes, retroactively, as inevitable. This is the opposite of probabilistic thinking.

    • Hindsight bias makes it difficult to recognize that the outcome was one of many possible futures.
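
  • [Me: A minimal sketch of the expected-value arithmetic from the scenario-planning notes above. The grant numbers come from the example in these notes; the helper function and two-outcome framing are mine, for illustration only.]

    def expected_value(scenarios):
        """Sum of probability * payout across all possible outcomes of a scenario."""
        assert abs(sum(p for p, _ in scenarios) - 1.0) < 1e-9, "probabilities should sum to 1"
        return sum(p * payout for p, payout in scenarios)

    # 25% chance of winning a $100,000 grant, 75% chance of winning nothing.
    grant = [(0.25, 100_000), (0.75, 0)]
    print(f"Expected value of applying: ${expected_value(grant):,.0f}")  # $25,000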

