Thinking in Bets is a book about how to make better decisions when you don’t have all the facts. People often say life is like chess, but really it’s more like poker, because there is information asymmetry and luck. There are lots of good insights in this book, and I would strongly recommend following Annie Duke on Twitter and me @tomiwa1a.
She does such a good job of explaining human behavior using data, stories and humor. My favorite part of the book is that she doesn’t just explain human behavior; she also offers practical solutions on how to make better decisions and improve your behavior patterns, as you will see in my notes below. I think she could write a really good book on relationships/dating/marriage and it would be popular with both men and women.
Also, I’ve been thinking a lot about how video is now outperforming text. I tried to fight it at first, but I’ve learnt that it’s better to embrace and adapt to change. If you would like a video review of “Thinking in Bets: Book Review and Top 5 Takeaways”, tweet me and let me know @tomiwa1a!
- “Resulting”: judging a decision based on the result instead of on whether the process you followed was right. For example, Pete Carroll’s decision to pass the ball in the 2015 Super Bowl
- I disagree with the argument that Pete Carroll had the right decision-making process; the risk-reward was asymmetric. I think her examples of firing a bad CEO and sales still going down, or a drunk driver getting home safely, are better examples of people resulting, but I get the idea
- Daniel Kahneman, Thinking, Fast and Slow, System 1 and System 2; similar to what Dalio said in Principles (see my review)
- Theory of Games and Economic Behavior by John von Neumann, which introduced game theory
- I like how she said that life is like poker, not chess, because in life you have (1) incomplete information and (2) luck
- So in life there is no objectively correct answer and it is not deterministic 
- Clever trick and funny story from The Princess Bride about deception and game theory 
- Sometimes it’s good to say “I don’t know” 
- Very interesting example of how people said the Brexit bookies were “wrong” because they said Brexit had a 30% chance of happening
- Also, bookmakers don’t necessarily set odds based on the favorites; they set the odds so that the payoff is such that the losers pay the winners, while they remain market neutral and just take their fee
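To make the market-neutral idea concrete, here is a rough sketch of my own (not from the book, and the stake numbers are made up): given the money wagered on each side, a bookmaker can set decimal odds so that whichever side wins, the winners are paid entirely out of the pooled stakes and the bookmaker keeps a fixed fee.

```python
# Sketch (my illustration, not from the book): set decimal odds from the
# money wagered on each outcome so the book is balanced. Whichever side
# wins, total payout equals the pool minus the bookmaker's fee.

def balanced_odds(stakes, fee_rate=0.05):
    """Return decimal odds per outcome, given the stakes wagered on each."""
    pool = sum(stakes.values())
    payout_pool = pool * (1 - fee_rate)  # bookmaker keeps the fee
    # If an outcome wins, its backers split the payout pool among themselves.
    return {outcome: payout_pool / stake for outcome, stake in stakes.items()}

stakes = {"leave": 30_000, "remain": 70_000}  # hypothetical Brexit book
odds = balanced_odds(stakes)

# Whichever outcome wins, the payout is the same: 100,000 * 0.95 = 95,000.
for outcome, stake in stakes.items():
    assert abs(stake * odds[outcome] - 95_000) < 1e-6
```

Note that the implied probability of an outcome is just 1 divided by its decimal odds, so the odds reflect where the money went, not the bookmaker’s own forecast, which is exactly why a 30% price on Brexit wasn’t “wrong”.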
- Fascinating story of John “Johnny World” Hennigan, a poker player in the 1990s who took a $30,000 bet that he could live on a street in Des Moines, Iowa for a month. When he tried to exit the bet, he first asked THEM to pay him $15K, but ended up paying $15K himself to get out.
- Interesting stories and anecdotes of humans being unwilling to change their minds when they get new information [52, 55]
- Interesting story of the 1951 Princeton vs. Dartmouth football game, where fans on both sides, when surveyed, described what sounded like two completely different games; similar to the Israel-Palestine article, where readers on each side interpreted the same story completely differently.
- Fake news has always existed and was practiced by people like Orson Welles, Joseph Pulitzer (yes, the guy the Pulitzer Prize is named after) and William Randolph Hearst
- The reason it is effective is that it tells you things you already WANT to believe
- Smarter people can be more susceptible to this because the smarter you are, the better you are at rationalizing and constructing a narrative around whatever you want to believe
- West, Meserve and Stanovich tested the blind-spot bias: we are better at detecting flaws in others than in ourselves, and the smarter you are, the greater your blind-spot bias
- The learning loop 1 image reminds me of one of the feedback loops in Ray Dalio’s Principles
- The story of Nick the Greek, who had a strongly held belief that playing bad poker hands was a good strategy because people didn’t expect it (I’m not kidding). Her stories about the different characters in the book remind me of Fat Tony and Nero Tulip in Nassim Taleb’s Antifragile (highly recommended)
- Interesting story about the Steve Bartman play in the Chicago Cubs’ 2003 playoff run: a fan goes for a foul ball and catches it, and everyone blames him for the Cubs losing, even though other people went for the ball and the Cubs had multiple chances to win the series. When the Cubs won the World Series in 2016, they offered him a championship ring; he accepted it and said it should be a reminder of how we treat each other
Thinking in Bets Groups
- Big problem with confirmation bias and groupthink in social psychology, pointed out by Jon Haidt: 85-96% of members self-identify as left of center, and most of the remaining 4-15% identify as moderate or centrist rather than conservative
- In the 90s, liberals outnumbered conservatives 4-1; recent surveys show that ratio has grown to 10-1
- Jon Haidt, Philip Tetlock, Jose Duarte, Jarret Crawford and Lee Jussim founded the Heterodox Academy to “fight drift towards homogeneity of thought”
- They published a paper that talks about steps to allow more diversity of opinion in your group; the author suggests following more people on Twitter with viewpoints opposed to yours. I really like how she concluded this section
- Absolutely fascinating section on how using betting markets instead of peer review can actually lead to better, more objective scientific research  [tomiwa reread this section]
- The idea is that having scientists bet on whether their experiments are reproducible reduces bias. A major problem is when people publish a paper like “drinking Kombucha, Keto, Paleo can reduce fat by 20%” but other scientists can’t reproduce those results
- The Reproducibility Project aims to solve this: experts using peer review, when asked if a given experiment would replicate, were right 58% of the time; when the traders were experts with money on the line, they were right 71% of the time. Another classic example of “Skin in the Game”, which I think may be Nassim Taleb’s greatest intellectual contribution [Link to BBS paper]
- Fascinating story of Robert K. Merton, who did pioneering work on how to get diversity of opinion in scientific groups around the 1950s
- CUDOS: Merton’s norms for scientific communities: Communism (findings and data belong to the community and are shared), Universalism, Disinterestedness and Organized Skepticism
- In 1967, the scientific community was split on whether sugar or fat was causing heart disease; three Harvard scientists published a report in the New England Journal of Medicine saying that fat was the culprit
- The paper had massive effects on Americans’ eating habits; an article published in 2016 shows that the sugar industry had paid the scientists to write it
- That is a clear, financial example of a conflict of interest, but we do this in other ways: we don’t process information independently of how we wish the world to be
- Richard Feynman found that if we know or can intuit the hypothesis being tested, the analysis is more likely to support the hypothesis
- This is why I am suspicious of most “research papers” in the social sciences that somehow always reach conclusions aligning with the author’s personal opinion
- Robert MacCoun and Saul Perlmutter, in a 2015 Nature article, mention outcome-blind analysis, which is used in particle physics and cosmology but not in the biological, psychological and social sciences
- The idea is that you introduce a random variable so that those analyzing the data cannot infer the outcome the researcher is hoping for
- Sometimes it is best to analyze a strategy before you know what the outcome of your strategy is 
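Here is a minimal sketch of my own (not from the MacCoun and Perlmutter article, and the measurements are made up) of how outcome-blind analysis can work: shift the data by a hidden random offset before the analyst sees it, and remove the offset only after the analysis choices are frozen.

```python
import random

# Sketch (my own, not from the article): blind the data with a hidden
# random offset so the analyst cannot tell whether the result favors the
# hypothesis while they make their analysis choices. Unblind only once
# the analysis is frozen.

def blind(data, seed):
    """Return (blinded data, offset); the offset is kept hidden from the analyst."""
    rng = random.Random(seed)
    offset = rng.uniform(-10, 10)
    return [x + offset for x in data], offset

measurements = [4.2, 5.1, 3.9, 4.8]  # made-up raw data
blinded, offset = blind(measurements, seed=42)

# The analyst works only on `blinded`: choosing cuts, fitting models, etc.
blinded_mean = sum(blinded) / len(blinded)

# After the analysis is frozen, subtract the offset to reveal the true result.
true_mean = blinded_mean - offset
assert abs(true_mean - sum(measurements) / len(measurements)) < 1e-9
```

The point is that every decision the analyst makes is made without knowing whether it pushes the result toward or away from the hoped-for outcome.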
- Sidebar: I recently tweeted that companies should stop having “strategies” and instead have “hypotheses”
- A strategy implies “I know this to be true and I will run through a wall to validate it”; a hypothesis says “here is what we think is true, here is how we are going to test it, and here is how we will know it is true”. It also promotes a more experimental attitude
- She tells a funny story about how she gives talks and leaves off how the hands ended. She writes: “…I had left them teetering on the edge of a cliff. ‘Wait! How did the hand turn out?’ I gave them the red pill: ‘It doesn’t matter’ ” Brilliant writing!
- Mentions the importance of having diversity of opinion in a group 
- I generally agree but it is not a “more is good” thing, the trick is finding the inflection point:
- In an organization you want diversity of opinion so that people can challenge and improve each other’s ideas, but you actually want a small degree of groupthink, where everyone is passionately on the same page, because I often find that groups where people agree with each other and are on the same page tend to move faster. The trick is finding that balance.
- Funny David Letterman story. A lesson for me, because I have a tendency to give unsolicited advice; I need to do a better job of asking for permission first, which she calls a “social contract” [page number tk]
- Red teams, and the State Department Dissent Channel [page number? tk]
- Some very practical strategies of truth seeking:
- Express Uncertainty: if you say “I might be wrong…” or “I’m not sure”, it gives people an opportunity, “social permission”, to share useful information with you that they may otherwise have held back
- Lead with Assent: when expressing a difference of opinion, start with what you agree with, then say “AND…” not “BUT…”; this prevents people from becoming defensive
- My addition: also, try to watch your tone so that people are not waiting for the other shoe to drop. I’ve noticed there is a certain tone in which we tend to talk where, even while someone is agreeing with you, you know a “but” is coming, so you’re already getting emotional and defensive while they are talking. I’m not sure what the solution to this is, but being observant that it exists is one step towards solving it.
- Ask For Temporary Agreement in Truth-Seeking: if someone is emotionally offloading to us, ask if they just want to vent or are looking for advice on what to do next. She suggests the phrase: “Do you want to just let it all out [and talk about it] or are you looking for what to do next?”
- I have a habit of giving unsolicited advice and this is a reminder for me that I should probably get consent before I give my advice
- The future hasn’t happened yet, so try to focus conversations on it; when people focus on the past, they get too hung up on creating narratives to rationalize what happened
Adventures in Mental Time Travel
- She suggests using past and future versions of yourself to make better decisions
- Very profound quote on how we should think about happiness as a stock that steadily rises over the long term, paying less attention to the day-to-day fluctuations. Very cool graph of Berkshire’s long-term vs. short-term stock price to illustrate this.
- Ulysses Contract: named after Odysseus, who had his crew put wax in their ears and tie him to the mast before sailing past the Sirens, because he knew he would be tempted by their songs. The idea is to make a plan in advance that accounts for future decision-making hazards.
- E.g. when I am tired, I open YouTube and start clicking on endless related videos, so I installed a YouTube recommendation blocker that blocks recommended videos
- This might be a new word for an old concept: create barriers to things that tempt you
- Very pragmatic advice of sometimes you just need to vent and complain. Even though it’s arguably irrational, it is normal to want to emotionally offload, it can help you relax.
- Decision swear jar: pay in every time you use words or take actions that usually signal you are doing something irrational:
- overconfidence: “I am 100% sure”
- Irrational outcome fielding: “I got unlucky” or “I planned it perfectly”
- Scenario planning, when making a decision, think of all possible outcomes and their rough probabilities. I can’t tell if this is insightful or obvious 
- Data analysis of Pete Carroll’s decision
- Backcasting and premortem, imagine the future already happened. How did you get here/there?
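The scenario-planning idea above boils down to a small expected-value calculation. Here is a sketch with entirely made-up scenario names, probabilities and payoffs: enumerate the outcomes of each option, assign rough probabilities, and compare the probability-weighted payoffs.

```python
# Sketch with made-up numbers: scenario planning as an expected-value
# calculation. Each option maps to a list of (probability, payoff) pairs.

scenarios = {
    "launch_early": [(0.3, 100_000), (0.5, 20_000), (0.2, -50_000)],
    "wait_a_quarter": [(0.6, 40_000), (0.4, 0)],
}

def expected_value(outcomes):
    """Probability-weighted payoff of a list of (probability, payoff) pairs."""
    total_p = sum(p for p, _ in outcomes)
    assert abs(total_p - 1.0) < 1e-9  # rough probabilities must sum to 1
    return sum(p * payoff for p, payoff in outcomes)

for name, outcomes in scenarios.items():
    print(name, expected_value(outcomes))
# launch_early: 0.3*100k + 0.5*20k + 0.2*(-50k) = 30,000
# wait_a_quarter: 0.6*40k + 0.4*0 = 24,000
```

The obvious-seeming part is the arithmetic; the insightful part, in the spirit of the book, is forcing yourself to write down the bad outcomes and their probabilities before deciding, instead of judging the decision by whichever single outcome happens to occur.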
Overall, Thinking in Bets is a very information-dense and well-written book. I would strongly recommend it for anyone trying to get better at making decisions without all the information. Actually, I would just recommend it for anyone who wants to understand human behavior better, which should be everyone!