The Undoing Project by Michael Lewis describes the friendship and collaboration between psychologists Daniel Kahneman and Amos Tversky. As the book recounts, both grew up in Israel, began their collaboration at the Hebrew University of Jerusalem, and later continued their work in North America. Their friendship spanned decades until Tversky’s death from cancer in 1996 at the age of 59.
The book provides insight into the minds of two men who were among the most influential psychologists in history. Their work led to fundamental changes in how modern society perceives human rationality. This article summarizes nine lessons learned from their ground-breaking research, which focused primarily on cognitive biases that often result in faulty decisions.
1. We are not the rational beings we believe ourselves to be
Our decisions are often influenced by factors unrelated to the decision itself. Kahneman and Tversky’s research identified several mental shortcuts, called “heuristics,” and the systematic errors (cognitive biases) they produce. We don’t make decisions through pure reasoning; instead, our brains rely on heuristics to save time and energy whenever possible.
For example, when quickly judging whether someone is “attractive” or “unattractive,” you rely on the first piece of information available (typically appearance). With more time to gather data (e.g., longer observation or further conversation), you might reach a different judgment.
Cognitive biases don’t make us irrational; they make us human. They allow our brains to function efficiently and effortlessly (much like an autopilot) even when we’re facing difficult decisions. However, this often results in faulty decision-making. Therefore, we must remember that cognitive biases exist and account for them whenever possible.
2. We don’t make decisions based on objective facts
In the book, Lewis recounts Kahneman and Tversky’s famous “Linda problem,” a study of how we use information. Participants read about a young woman who was very bright, outspoken, and deeply concerned with issues of social justice. They were then asked whether she was more likely to be “a bank teller” or “a bank teller who is active in the feminist movement.”

Most participants chose the second option because it fit the description better, even though it cannot be more probable: every feminist bank teller is also a bank teller. The vivid details led them to set aside basic logic, a mistake Kahneman and Tversky called the conjunction fallacy.
When it comes to decision-making, we often focus on the facts that reinforce our opinion and ignore those that contradict it. Kahneman calls this tendency “myside bias.” Unfortunately, we also tend to assume that others are just like us, which means they view their options through myside bias too. It’s therefore important to challenge your beliefs by actively looking for evidence that contradicts them.
3. We need to make decisions quickly
Our brains rely on heuristics because they take less time and energy than more comprehensive alternatives, but this often results in flawed decision-making. Kahneman and Tversky realized that our reliance on heuristics becomes a problem when it leads to decisions that could have been avoided with more time and effort.
To solve this dilemma, they identified “slow thinking” as the alternative to heuristics. A slower, more deliberate approach lets you weigh all the relevant factors instead of rushing to a conclusion. One useful tactic is to ask questions about the problem and commit to your own answer before consulting outside sources, so that irrelevant information can’t anchor your judgment.
Although “slow thinking” is a more effective decision-making strategy, it’s not always an option: we rarely have the time or energy to thoroughly analyze every decision that comes our way. Your best bet is to strike a balance, using heuristics only when a quick decision is necessary and relying on slow thinking whenever you can.
This concept is relevant to UX design because user experience requires a lot of decision-making. We make choices about design, content, interactions, etc., each day. If we wait too long for more information about our users, the opportunity may pass us by; however, rushing to conclusions can lead to flawed decisions too.
Kahneman suggests treating heuristics as a “rule of thumb” whenever possible. You should carefully consider all factors before making a decision but recognize that you will have to make compromises at some point in the process.
4. We mistake accuracy for truth
When deciding how to act, we often use past experience as evidence of what works and what doesn’t. For example, if I got a speeding ticket last week while driving through a school zone, I might slow down there this week because I know the area is patrolled. This approach makes sense when the situation hasn’t changed much, but it becomes problematic when conditions have changed.

Our tendency to rely on past experience causes us to mistake accuracy for truth because we don’t stop to ask whether the evidence is still relevant. Kahneman describes this as “the illusion of validity”: we remain confident in an opinion or decision even when the evidence no longer supports it.
For example, a person’s past experience with a subject doesn’t make them an expert on it today. They may have learned about the topic years ago, but circumstances have changed since then. So even though we may consider someone an expert because of their previous experience, their knowledge isn’t necessarily still relevant.
5. We think we understand probabilities better than we do
When making predictions about uncertain events, our perception of probability is often inaccurate and skewed in ways that reinforce our existing beliefs.

We tend to fall back on the availability heuristic when thinking about probabilities because it’s easier to recall examples that fit what we already know than to weigh all the options objectively.

You can avoid this trap by thinking about probabilities as ranges of possibilities rather than predicting specific outcomes. In addition, be mindful of the availability heuristic and check for confirmation bias in your own reasoning as well as in others’.
6. We let emotion cloud our judgment
Since we’re bad at making rational decisions under pressure, we should avoid putting ourselves in stressful situations when possible. However, that’s easier said than done because most work environments involve some degree of stress. Therefore, it’s important to recognize emotions as they’re happening so you can take a step back and evaluate the situation.
If you find yourself in an emotionally charged situation at work, think about what’s causing your emotion before reacting. Ask yourself whether you’re focusing on both the positive and negative aspects of what you hear or just one part of it. Before deciding what to do, assess whether any facts contradict your opinion and consider how comparable situations have played out in the past.
7. We become more optimistic when we’re tired
Our judgment is impaired when we’re overly fatigued because deliberate thinking draws on a limited reserve of mental energy. As Kahneman points out, this “is why people make stupid mistakes in the early hours of the morning or after a night without sleep.”
This impairment causes us to make decisions we wouldn’t normally make: we think we understand events better than we do and give too much weight to first impressions. For example, if you meet someone new and they say something offensive within the first twenty minutes, you’re likely to give that remark outsized weight because it’s one of the first things you’ve heard from them.
Since you shouldn’t make important decisions when tired, be mindful of your energy levels throughout the day and recognize when fatigue is affecting your ability to think clearly. You can avoid rash decisions made under stress or fatigue by taking a step back and engaging in some self-reflection.
8. We focus on one aspect of a story at the expense of others
This cognitive bias, known as “confirmation bias,” occurs when we seek out information that supports our opinions and ignore evidence to the contrary. It makes decisions easier but clouds our judgment of more complex problems.
Confirmation bias can cloud our judgment in many different areas; however, the most common example of this cognitive trap is discussing politics with friends and family. People on both sides of the aisle like to seek out sources that agree with their views while ignoring information provided by people who disagree with them. This behaviour doesn’t allow them to gain a holistic view of the situation, making it difficult to implement solutions.
The best way to avoid confirmation bias is by seeking out people who disagree with you when looking for new information. This can be done when reading about an important issue online or discussing politics in your social circles. For example, if you read an article about gun control and you generally agree with it, think about how the opposite argument would sound before accepting the article as fact.
9. Bad decisions can become good decisions over time
The ultimate lesson provided by Kahneman and Tversky’s (2000) research is that “no decision is optimal or correct in an absolute sense; all we can hope to do is avoid errors.” To use their example, the risk of alienating a friend for life is high when you have an affair with their partner but low when you accidentally tell them that you hate their new shirt. Of course, both decisions are bad, but they’re judged very differently depending on the situation.
This isn’t only true of moral dilemmas; we make bad decisions in all areas of our lives and must be willing to correct them. For instance, you might fail a test because you organized your study schedule poorly, or lose something at home because you never set up a designated place for it.
These types of mistakes are unfortunate, but they’re unavoidable. You can’t change the past, but you can analyze your decisions and do everything in your power to make sure similar errors don’t happen again. When you fall victim to confirmation bias or act impulsively, understand those bad decisions aren’t necessarily permanent. You can choose to learn from your mistakes and adjust your behaviour accordingly.
In a Nutshell: The Undoing Project by Michael Lewis
Kahneman and Tversky’s research suggests that we’re not as rational as we think we are when it comes to decision-making. We make rash decisions when tired, ignore evidence that contradicts our beliefs, and give too much weight to first impressions.
This is important because we don’t always consider the cognitive traps that affect our decision-making skills. However, with some self-reflection and outside help (i.e., people who disagree with you), we can reduce the effect of these mistakes and make better decisions in general.