Book Review: Thinking in Bets
At the suggestion of Andre de Verteuil, I read Thinking In Bets: Making Smarter Decisions When You Don't Have All The Facts. Overall it was a good and useful read, though it took me a long time to finish. In part this book suffers from a problem common to many such works: it has too many words. Whole paragraphs can be skimmed or skipped without losing any takeaways. The other challenge, for me, was that I'm not a gambler. The author is a professional poker player and leans on many poker-related anecdotes that simply didn't resonate with me. Thankfully, the rules and mechanics of poker aren't dealt with much, but the premise of "betting" as a framing device for decision making comes from her history of making professional bets in poker games. That made it hard for me, personally, to connect with parts of the book.
With those complaints out of the way, it's also important to highlight that I didn't take away any specific strategies for making better decisions. There's no easy decision making formula that is generically applicable to all situations. But thinking about decision making, and the feedback loop of decision → outcome → analysis → decision is helpful.
The author opens with an explanation of resulting: the tendency to equate the quality of a decision with the quality of its outcome. She offers several good examples, and it was a powerful way to start the book. There is a very big distinction between the quality of a decision and the quality of that decision's outcome. Well informed, thoughtful decisions may justifiably be described as "the right decision" even if the outcome is other than what was desired.
I ask group members to come to our first meeting with a brief description of their best and worst decisions of the previous year. I have yet to come across someone who doesn’t identify their best and worst results rather than their best and worst decisions.
Closely related to the issue of resulting is hindsight bias: the tendency, after an outcome is known, to see the outcome as having been inevitable. Later in the book she expands upon this to observe that once something occurs, we no longer think of it as probabilistic - or ever having been probabilistic.
The first step in improving our decision making is getting comfortable with not having all the facts. “I don’t know” is not a failure but a step toward enlightenment. What makes a great decision is not that it has a great outcome. A great decision is the result of a good process, and that process must include an attempt to accurately represent our own state of knowledge.
When we accept that we can’t be sure, we are less likely to fall into the trap of black-and-white thinking. For most of our decisions there will be a lot of space between unequivocal “right” and “wrong.”
Once a belief is lodged, it becomes difficult to dislodge. It takes on a life of its own, leading us to notice and seek out confirming evidence, and ignore or work hard to actively discredit information contradicting the belief. This irrational, circular information-processing pattern is called motivated reasoning. The way we process new information is driven by the beliefs we hold, strengthening them.
If we think of our beliefs as only 100% right or 100% wrong, when confronting new information that might contradict our belief, we have only two options: (a) make the massive shift in our opinion of ourselves from 100% right to 100% wrong, or (b) ignore or discredit the new information. It feels bad to be wrong, so we choose (b). Information that disagrees with us is an assault on our self-narrative.
We would be better served as communicators and decision-makers if we thought less about whether we are confident in our beliefs and more about how confident we are. Instead of thinking of confidence as all-or-nothing, our expression of our confidence would then capture all the shades of grey in between.
We are less likely to succumb to motivated reasoning since it feels better to make small adjustments in degrees of certainty instead of having to grossly downgrade from “right” to “wrong.” When confronted with new information, it is a different narrative to say “I was at 58% but now I’m at 46%.” That doesn’t feel nearly as bad as “I thought I was right but now I’m wrong.”
Experience is not what happens to a man; it is what a man does with what happens to him.
-- Aldous Huxley
"Any decision ... is a bet on what will likely create the most favorable future for us."
Self-serving bias: we take credit for the good stuff and blame the bad stuff on luck so it won’t be our fault.
Where we blame our own bad outcomes on luck, when it comes to our peers, bad outcomes are clearly their fault. While our own good outcomes are due to our awesome decision-making, when it comes to other people, good outcomes are because they got lucky.
We must believe in luck. For how else can we explain the success of those we don’t like?
-- Jean Cocteau
Thinking in bets triggers a more open-minded exploration of alternative hypotheses, of reasons supporting conclusions opposite to the routine of self-serving bias. We are more likely to explore the opposite side of an argument more often and more seriously - and that will move us closer to the truth of the matter.
Having a group of people to help you explore and improve your decision making process can be a good idea. Any such support group needs a clear charter:
- Focus on accuracy (over confirmation), which includes rewarding truthseeking, objectivity, and open-mindedness within the group
- Accountability, for which members have advance notice
- Openness to a diversity of ideas
“People are more willing to offer their opinion when the goal is to win a bet rather than get along with people in a room.”
“Accuracy, accountability and diversity wrapped into a group’s charter all contribute to better decision making, especially if the group promotes thinking in bets.”
The book spends some time exploring the Mertonian Norms and how various scientific communities have employed thinking in bets to improve the process of science. The norms are codified by the acronym CUDOS: communism, universalism, disinterestedness, organized skepticism.
- Communism “... I’ve encouraged companies to make sure they don’t define ‘winning’ solely by results or providing a self-enhancing narrative. If part of corporate success consists of providing the most accurate, objective, and detailed evaluation of what’s going on, employees will compete to win on those terms.”
- Universalism “Don’t shoot the message because you don’t like the messenger.”
- Disinterestedness “Outcome blindness” - if the group is blind to the outcome, it produces higher fidelity evaluation of decision quality.
- Organized Skepticism “True skepticism isn’t confrontational.”
“We need to be particularly skeptical of information that agrees with us because we know that we are biased to just accept and applaud confirming evidence.”
“Organized skepticism invites people into a cooperative exploration. People are more open to hearing differing perspectives expressed this way.”
We can improve our communication with others:
- Express uncertainty: the audience is more likely to understand that any discussion that follows will not involve right versus wrong.
- Lead with assent: listen for things you agree with, state those and be specific, and then follow with “and” instead of “but”.
While I complained at the beginning of this post that the book doesn't give much practical advice for actually making decisions, re-reading my notes suggests that maybe I was overly critical. Several techniques are proposed, all of which are good ideas. First, think about potential negative outcomes that might result from your decision. This moves regret in front of the decision instead of after it, and helps us avoid making impetuous decisions for short-term benefits that might have deleterious long-term results.
Suzy Welch suggests "10-10-10": ask yourself what are the consequences of each of my options in ten minutes? Ten months? Ten years? The book expands on this to suggest asking how would I feel today if I had made this decision ten minutes ago? Ten months ago? Ten years ago?
The author introduces the notion of backcasting: identify the goal and work backwards to “remember” how we got there. Imagine we’ve already achieved a positive outcome, holding up a newspaper with the headline “We achieved our goal!” Then we think about how we got there.
Backcasting makes it possible to identify when there are low-probability events that must occur to reach the goal. That could lead to developing strategies to increase the chances those events occur or to recognize the goal is too ambitious.
A similar idea is a premortem: working backward from a negative future. Imagining a headline that reads “We failed to reach our goal” challenges us to think about ways in which things could go wrong that we otherwise wouldn’t if left to our own devices.
...incorporating negative visualization makes us more likely to achieve our goals.
...we need to have positive goals, but we are more likely to execute on those goals if we think about the negative futures.
All in all, Thinking In Bets was a worthwhile read. I have some new ways to think about decisions, and hopefully I can be more aware of the various biases that challenge good decision making. It would have been nice if the book had been a little shorter.