I recently finished reading Thinking in Bets, by Annie Duke. The world is in a critical situation 🦠, and what decisions we make in our personal and professional lives right now could be vital in how the future pans out for us. I decided to pick this book up to see if I can gather some ideas that can help me navigate my life better.
Here are the ideas that I liked and would like to remember. I write them down for, among other reasons, solidifying my learning and making these notes more accessible to me (I am obsessed with my blog and check it every ~~once in a while~~ hour 🙄).
A bet is a decision about an uncertain future. Treating decisions as bets lets you avoid common decision traps, learn from results in a more rational way, and keep emotions out of the process as much as possible.
Explicitly calling our decisions bets lets us anticipate and prepare for situations when irrationality is likely to keep us from acting in our best interest.
“Decision quality” and “result quality” are different. We have a tendency to judge a decision based on its outcome: if the outcome was good, it was a good decision, and if the outcome was bad, it was a bad decision. This is called Resulting.
Most of the time this will be true: the outcome and the decision will be related to each other. Still, I found it a good mental model to have in your kitty. Sometimes we might judge a decision to be wrong because the result was bad, when the result could have been bad because of sheer luck. That doesn’t mean we shouldn’t make that decision again. Breaking down the decision process into decision quality and result quality paves the way for a more accurate analysis of your decisions, and can allow you to learn more from them.
If someone says “Wanna bet?” when you express a belief, it immediately makes you question your sources, your confidence in the belief, etc. This can be used as a tool to elicit skepticism.
While it is not possible to go around saying “Wanna bet?” to people, you can make it a habit to challenge yourself internally before forming an opinion.
Accepting that everything doesn’t have to be a 0 or 1, black or white, can help you open yourself to more learning and feedback. Once we form a belief, we start seeing everything through the lens of that belief. Things that solidify our belief are filtered in, and everything else is overlooked. Staying in the grey can make you more receptive to new ideas.
Additionally, if you admit that “you are not 100% sure” about something, you signal to people that you are open to new ideas. People generally don’t want to get into arguments, but when they sense that you understand that you might not be “obviously right”, they feel more confident in collaborating with you. This can ultimately lead to you getting more honest feedback and deriving more learning from the discussion.
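Staying in the grey maps naturally onto probabilities. As a toy illustration (the numbers and the evidence likelihoods here are invented, not from the book), Bayes' rule shows how a less-than-certain belief leaves room to move when new information arrives:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return updated confidence in a belief after seeing a piece of evidence."""
    # Total probability of seeing this evidence at all
    p_evidence = p_evidence_if_true * prior + p_evidence_if_false * (1 - prior)
    # Bayes' rule: P(belief | evidence) = P(evidence | belief) * P(belief) / P(evidence)
    return p_evidence_if_true * prior / p_evidence

# Start 70% sure of a belief; see evidence twice as likely if the belief is true.
posterior = bayes_update(prior=0.70, p_evidence_if_true=0.8, p_evidence_if_false=0.4)
print(round(posterior, 2))  # → 0.82: more confident, but still not a 1
```

Holding the belief at 100% to begin with would leave the update with nothing to do, which is one way to read "staying in the grey."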
We can’t just “absorb” experiences and expect to learn. As novelist and philosopher Aldous Huxley recognized, “Experience is not what happens to a man; it is what a man does with what happens to him.” There is a big difference between getting experience and becoming an expert. That difference lies in the ability to identify when the outcomes of our decisions have something to teach us and what that lesson might be.
You can’t learn from experience if you tag your decisions incorrectly. To make things more difficult, we have a “self-serving bias”. We take credit for the good stuff and blame the bad stuff on luck so it won’t be our fault. And, unsurprisingly, when it comes to other people, bad outcomes are their fault, and good outcomes are results of luck.
As artist and writer Jean Cocteau said, “We must believe in luck. For how else can we explain the success of those we don’t like?”
Again, thinking about outcome fielding as a bet can help. “I think that this outcome was because of my skill (or luck). Ok. Wanna bet?”
Truthseeking, the desire to know the truth regardless of whether the truth aligns with the beliefs we currently hold, is not naturally supported by the way we process information. We might think of ourselves as open-minded and capable of updating our beliefs based on new information, but the research conclusively shows otherwise. Instead of altering our beliefs to fit new information, we do the opposite, altering our interpretation of that information to fit our beliefs.
Not every situation is appropriate for truthseeking, nor is everyone up for it all the time. Before you indulge in truthseeking with someone, make sure the other person wants it as well. Otherwise, you will end up hurting people or getting hurt yourself (when they pelt stones at you).
CUDOS stands for Communism, Universalism, Disinterestedness, and Organized Skepticism:
Communism (data belong to the group), Universalism (apply uniform standards to claims and evidence, regardless of where they came from), Disinterestedness (vigilance against potential conflicts that can influence the group’s evaluation), and Organized Skepticism (discussion among the group to encourage engagement and dissent).
Communism: More is more. While evaluating a decision, get all the information out there, anything that seems even remotely relevant.
- “As a rule of thumb, if we have an urge to leave out a detail because it makes us uncomfortable or requires even more clarification to explain away, those are exactly the details we must share. The mere fact of our hesitation and discomfort is a signal that such information may be critical to providing a complete and balanced account.”
Universalism: Acceptance or rejection of an idea must not “depend on the personal or social attributes of their protagonist.” “Don’t shoot the message,” for some reason, hasn’t gotten the same historical or literary attention, but it addresses an equally important decision-making issue: don’t disparage or ignore an idea just because you don’t like who or where it came from.
- When we have a negative opinion about the person delivering the message, we close our minds to what they are saying and miss a lot of learning opportunities because of it. Likewise, when we have a positive opinion of the messenger, we tend to accept the message without much vetting. Both are bad.
Disinterestedness: People have conflicts of interest, and those can get in the way of fielding an outcome as skill or luck. Telling someone how a story ends encourages them to be resulters, to interpret the details to fit that outcome. Make it a habit when seeking advice to give the details without revealing the outcome.
- “If I won a hand, it was more likely my group would assess my strategy as good. If I lost, the reverse would be true. Win a case at trial, the strategy is brilliant. Lose, and mistakes were made. We treat outcomes as good signals for decision quality, as if we were playing chess. If the outcome is known, it will bias the assessment of the decision quality to align with the outcome quality. If the group is blind to the outcome, it produces higher fidelity evaluation of decision quality. The best way to do this is to deconstruct decisions before an outcome is known. Attorneys can evaluate trial strategy before the verdict comes in. Sales teams can evaluate strategy before learning whether they’ve closed the sale.”
Organized Skepticism: “Skepticism is about approaching the world by asking why things might not be true rather than why they are true. It’s a recognition that, while there is an objective truth, everything we believe about the world is not true. Thinking in bets embodies skepticism by encouraging us to examine what we do and don’t know and what our level of confidence is in our beliefs and predictions. This moves us closer to what is objectively true.”
Time Travel: We have a tendency to favour the needs of our present-self at the expense of our future-self. This is called Temporal Discounting: we are willing to take an irrationally large discount to get a reward now instead of waiting for a bigger reward later. We are more likely to be irrational and impulsive when making in-the-moment decisions. This is where “time travel” fits in. It’s not a new concept, just a different way of putting it.
- “We can interrupt an in-the-moment decision and take some time to consider the decision from the perspective of our past and future. We can then create a habit routine around these decision interrupts to encourage this perspective taking, asking ourselves a set of simple questions at the moment of the decision designed to get future-us and past-us involved. We can do this by imagining how future-us is likely to feel about the decision or by imagining how we might feel about the decision today if past-us had made it.”
- Another famous tool here is Suzy Welch’s 10-10-10 tool. “Every 10-10-10 process starts with a question… What are the consequences of each of my options in ten minutes? In ten months? In ten years?”
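A common way to model temporal discounting is the hyperbolic form, where a delayed reward’s subjective value shrinks as V = A / (1 + kD). A tiny sketch (the discount rate k and the dollar amounts are made-up illustrations, not from the book):

```python
def hyperbolic_value(amount, delay_years, k=1.0):
    """Subjective present value of a delayed reward: V = A / (1 + k * D)."""
    return amount / (1 + k * delay_years)

# To a heavily present-biased chooser, $150 a year away can "feel"
# worth less than $100 right now -- the irrationally large discount.
now = hyperbolic_value(100, delay_years=0)    # 100.0
later = hyperbolic_value(150, delay_years=1)  # 75.0
print(now > later)  # → True
```

The 10-10-10 questions work against exactly this curve: they force the value of the “in ten months” and “in ten years” branches back into view at decision time.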
Figure out the possibilities, then take a stab at the probabilities:
- “When faced with highly uncertain conditions, military units and major corporations sometimes use an exercise called scenario planning. The idea is to consider a broad range of possibilities for how the future might unfold to help guide long-term planning and preparation.” After identifying as many of the possible outcomes as we can, we want to make our best guess at the probability of each of those futures occurring.
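The two steps (enumerate the possible futures, then guess their probabilities) can be sketched as a small table whose probabilities must cover the whole space. The scenario names and numbers below are hypothetical, purely for illustration:

```python
# Hypothetical scenarios for some decision; names and numbers are invented.
scenarios = {
    "strong adoption": (0.2, 500_000),   # (probability, payoff in dollars)
    "modest adoption": (0.5, 100_000),
    "flop":            (0.3, -150_000),
}

# The listed futures should account for all of the probability mass.
total_p = sum(p for p, _ in scenarios.values())
assert abs(total_p - 1.0) < 1e-9, "probabilities must sum to 1"

# Expected value of the bet across the identified futures.
expected = sum(p * payoff for p, payoff in scenarios.values())
print(f"expected value: ${expected:,.0f}")  # → expected value: $105,000
```

The probabilities-must-sum-to-1 check is the useful discipline here: it forces you to notice futures you haven’t listed, which is the point of scenario planning.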