The Pre-brief
The practice of medicine is grounded in psychology as much as in knowledge and physical skill. Especially in emergency medicine and critical care, we face dozens of important decisions every day, often with incomplete information, and usually while stressed, tired, or hungry — a perfect storm for faulty decision making. These decisions have important consequences for our patients, their outcomes, and their finances.
In that context, medicine is an imperfect science practiced by well-meaning but fallible humans in a system designed for economic benefit. Whether in the ED or the ICU, providers sometimes rely on heuristics, or mental shortcuts, to make decisions based on their prior experiences, recognized patterns, and old habits. When time allows, we mull over our options more deliberately, weighing costs and benefits before making a determination.
In the 1970s, Amos Tversky and Daniel Kahneman, social scientists in Israel and later the US, founded the field of behavioral economics by positing and testing the idea that humans are irrational. Kahneman won the 2002 Nobel Prize in Economics for their work in the area (Tversky had died in 1996). Perhaps that is not surprising in our line of work, where we and our patients constantly make decisions that are detrimental to our health despite full knowledge of the consequences. At the time, however, this was groundbreaking because it went against the classically accepted, neoclassical economic principle that individuals are rational actors in a rational system.
The research of Tversky and Kahneman is summed up in Kahneman’s 2011 best seller, “Thinking, Fast and Slow,” in which he popularizes a simple way to explain the processes of our mind: dual-process theory. This is based on the idea of two “systems.” System 1 represents fast-twitch, unconscious, intuitive reasoning, while System 2 is conscious, analytical, and methodical. System 1 is prone to cognitive biases not because we lack intelligence, but because it trades deliberate reasoning for speed. System 2 seeks to prevent these cognitive errors and biases, but since we are flawed beings in a faulty system, they inevitably occur.
These tendencies to err arise for various reasons, including the aforementioned heuristics, the brain’s limited cognitive bandwidth, environmental and social influences, our emotional and moral motivations, and the challenges of remembering and recalling information.
To prevent these errors in cognition, we must know what they are. Wikipedia contains an entry that lists over 185 cognitive biases, and Design Hacks created the Cognitive Bias Codex, which diagrams these flaws in thought in a simple and beautiful way. It’s worth perusing them for the sake of curiosity and knowledge. Some are very similar, others you might know already, and a few will definitely make you wonder about how you make decisions.
The following 25 cognitive biases are worth understanding because they have a direct impact on problem-solving in medicine and on our general well-being in an inherently stressful career. All definitions are quoted from Wikipedia.
1) Base rate fallacy (base rate neglect) – “The tendency to ignore general information and focus on information only about the specific case, even when the general information is more important.”
“Only 4% of people get higher than 260 on Step 1. But I’m smart and I work hard! So my chances are greater.”
No, sorry. Your chances are still 4%.
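To see why base rates dominate diagnostic reasoning, here is a minimal back-of-the-envelope sketch in Python. The prevalence, sensitivity, and specificity below are made-up numbers chosen purely for illustration, not the characteristics of any real test.

```python
# Illustration only: all numbers are assumptions, not real test characteristics.
prevalence = 0.001     # base rate: 1 in 1,000 patients actually have the disease
sensitivity = 0.99     # P(positive test | disease)
specificity = 0.95     # P(negative test | no disease)

# Bayes' theorem: P(disease | positive test)
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
posterior = (sensitivity * prevalence) / p_positive

print(f"Post-test probability of disease: {posterior:.1%}")  # roughly 2%
```

Even with a positive result from a seemingly excellent test, the post-test probability is only about 2%, because the 0.1% base rate dominates.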
2) Anchoring bias – “The tendency to rely too heavily, or ‘anchor’, on one trait or piece of information when making decisions (usually the first piece of information acquired on that subject).”
3) Authority bias – “The tendency to attribute greater accuracy to the opinion of an authority figure (unrelated to its content) and be more influenced by that opinion.”
Classically seen when the President of the United States denounces mask-wearing as the oppression of freedom in the middle of a pandemic. It is also noted in the fluctuations of cryptocurrency prices in response to Elon Musk’s tweets.
4) Bias blind spot – “The tendency to see oneself as less biased than other people, or to be able to identify more cognitive bias in others than oneself.”
“I’m not racially biased! In fact, I don’t even see color. I actually read this blog post on cognitive biases so I don’t make those mistakes anyway.”
5) Fundamental attribution error – “The tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior.”
When the intern misses a line: “What an amateur! He’s so sloppy. He definitely needs to practice more. He’s just uncoordinated.”
When the senior misses a line: “The patient’s a hard stick. She has small veins. The nurse positioned the patient incorrectly. The room was too bright.”
6) Groupthink – “The psychological phenomenon that occurs within a group of people in which the desire for harmony or conformity in the group results in an irrational or dysfunctional decision-making outcome. Group members try to minimize conflict and reach a consensus decision without critical evaluation of alternative viewpoints by actively suppressing dissenting viewpoints, and by isolating themselves from outside influences.”
7) Availability heuristic – “The tendency to overestimate the likelihood of events with greater ‘availability’ in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be.”
“I read this amazing article on pulmonary embolisms on Critical Care Now last week. This patient is dyspneic so my money is on PE!”
“But she’s febrile, coughing, and her chest x-ray shows a lobar consolidation.”
“Oh no! She has lung cancer too! My aunt died from that.”
8) Confirmation bias – “The tendency to search for, interpret, focus on and remember information in a way that confirms one’s preconceptions.”
9) Compassion fade – “The predisposition to behave more compassionately towards a small number of identifiable victims than to a large number of anonymous ones.”
10) Conjunction fallacy – “The tendency to assume that specific conditions are more probable than a more general version of those same conditions.”
Here is my variation of the “Linda problem” proposed by Tversky and Kahneman:
Mary is 31 years old, single, outspoken, and very bright. She majored in biology. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in voter registration drives.
Which is more probable?
- Mary is a doctor.
- Mary is a doctor and is active in the feminist movement.
When faced with this problem, most people end up choosing the second option because it seems more representative of Mary. In reality, however, the conjunction of two events or qualities can never be more probable than either one alone.
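If you want to convince yourself, here is a tiny simulation sketch; the 30% and 60% rates are arbitrary assumptions, and the two traits are treated as independent only to keep the example simple.

```python
import random

# Toy simulation: the rates below are made up purely for illustration.
random.seed(0)
trials = 100_000
doctor = 0
doctor_and_feminist = 0

for _ in range(trials):
    is_doctor = random.random() < 0.30      # assumed P(doctor)
    is_feminist = random.random() < 0.60    # assumed P(active in the feminist movement)
    doctor += is_doctor
    doctor_and_feminist += is_doctor and is_feminist

print(f"P(doctor)              ~ {doctor / trials:.2f}")
print(f"P(doctor and feminist) ~ {doctor_and_feminist / trials:.2f}")
# P(A and B) = P(A) * P(B | A) <= P(A): the conjunction can never beat either event alone.
```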
11) Dunning-Kruger effect – “The tendency for unskilled individuals to overestimate their own ability and the tendency for experts to underestimate their own ability.”
12) Gambler’s fallacy – “The tendency to think that future probabilities are altered by past events when in reality they are unchanged. The fallacy arises from an erroneous conceptualization of the law of large numbers.”
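As a quick sanity check, here is a short sketch with a fair coin standing in for any independent random event (the streak length and number of flips are arbitrary choices):

```python
import random

# Simulate a fair coin and look only at the flip that follows five heads in a row.
random.seed(1)
flips = [random.random() < 0.5 for _ in range(1_000_000)]   # True means "heads"

after_streak = [flips[i + 5] for i in range(len(flips) - 5) if all(flips[i:i + 5])]

print(f"P(heads | five heads in a row) ~ {sum(after_streak) / len(after_streak):.3f}")
# Still about 0.500: past outcomes do not change the probability of the next one.
```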
13) Ambiguity effect – “The tendency to avoid options for which the probability of a favorable outcome is unknown.”
“I’ve never given that antibiotic before, so how about we just do what we always do?”
14) Sunk-cost fallacy – “The phenomenon where people justify increased investment in a decision, based on the cumulative prior investment.”
“Hmmm, the infection is getting worse and the antibiotic isn’t working. Let’s give more of it!”
15) Hot-hand fallacy (also known as the “hot hand phenomenon” or simply the “hot hand”) – the belief that a person who has experienced success with a random event has a greater chance of further success in additional attempts.
“I have gotten all of my last five intubations on the first pass. I’m on fire!”
16) Google effect – “The tendency to forget information that can be found readily online by using Internet search engines.”
17) Planning fallacy – “The tendency to underestimate one’s own task-completion times.”
18) Peak-end rule – “That people seem to perceive not the sum of an experience but the average of how it was at its peak (e.g. pleasant or unpleasant) and how it ended.”
Me in twenty years: “Oh man, residency was so much fun! Not having a life, lockdown during a pandemic, and those 24-hour shifts in the SICU weren’t so bad.”
19) Spotlight effect – “The tendency to overestimate the amount that other people notice your appearance or behavior.”
20) Outcome bias – “The tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.”
My thoughts during a code: “I have no idea what is wrong with this patient. Is that a-flutter or v-tach? Oh man, everyone is looking at me. I have to make a decision!”
Nurse: “Doc? What should we do?”
Me: “Give bicarb? No, give calcium and insulin!”
Nurse: “We have ROSC! And the i-STAT labs just came back. He had a potassium of 8.5! Nice pick up, doc!”
Me playing it cool: “Yeah, you know, I just had a feeling…”
21) Present bias – “The tendency of people to give stronger weight to payoffs that are closer to the present time when considering trade-offs between two future moments.”
22) Projection bias – “The tendency to overestimate how much our future selves share one’s current preferences, thoughts and values, thus leading to sub-optimal choices.”
Note to self: Don’t go to the grocery store hungry.
23) Selection bias – “The tendency to notice something more when something causes us to be more aware of it, such as when we buy a car, we tend to notice similar cars more often than we did before. They are not suddenly more common – we just are noticing them more. Also called the Observational Selection Bias.”
Perhaps one of the top reasons why hypochondriac medical students visit their doctor. Enlarged lymph nodes, anyone?
24) Semmelweis reflex – “The tendency to reject new evidence that contradicts a paradigm.”
Kind of like when cardiologists refuse to acknowledge the OMI Manifesto and don’t take your patient to the cath lab.
25) Pygmalion effect – “The phenomenon whereby others’ expectations of a target person affect the target person’s performance.”
Which seems to explain why you have a harder time getting the chest tube when the trauma surgery attending is watching you than when it’s your own attending.
The Debrief
- Whether in the ED or the ICU, providers sometimes rely on heuristics, or mental shortcuts, to make decisions based on their prior experiences, recognized patterns, and old habits.
- The field of behavioral economics, founded by the psychologists Amos Tversky and Daniel Kahneman, explores human decision making without the assumption that humans are rational actors.
- System 1 represents fast-twitch, unconscious intuitive reasoning while System 2 is conscious, analytical, and methodical. System 1 is prone to cognitive biases, while System 2 seeks to prevent these cognitive errors.
- The tendencies to err occur for various reasons, including heuristics, the limited cognitive bandwidth of the brain, environmental and social influences, our emotional and moral motivations, and challenges in remembering or recalling information and memories.
- To prevent these errors in cognition, we must know what they are.
References
- Kahneman, D. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.
- “List of Cognitive Biases.” Wikipedia. https://en.wikipedia.org/wiki/List_of_cognitive_biases
- Tversky, A.; Kahneman, D. (1982). “Judgments of and by representativeness”. In Kahneman, D.; Slovic, P.; Tversky, A. (eds.). Judgment under uncertainty: Heuristics and biases. Cambridge, UK: Cambridge University Press. ISBN 0-521-28414-7.