Recognizing your biases is the first step to improving patient care
A patient arrives presenting with fever, fatigue and joint pain. Based on the symptoms, you suspect the flu. The patient tells you that someone else in the household just had it, which supports your initial opinion. Moreover, it’s the middle of flu season. And the last seven patients you’ve seen with the same symptoms all had the flu.
Here's another possible diagnosis, though it's one for the doctor, not the patient: a case of cognitive bias.
In medicine, as in any other field, knowledge counts. But beyond what you know, it's how you think that matters.
Looking for evidence that supports your diagnosis rather than refutes it? That’s a confirmation bias. Locking in on salient features in the patient’s initial presentation early in the process? That’s an anchoring bias. Thinking that something is more likely or occurs more frequently because it readily comes to mind? That’s an availability bias.
The same cognitive biases are at play in everyday life, for all sorts of choices we make and perceptions we have.
We tend to gravitate to people who are similar to us in beliefs and values, because they feel more agreeable (confirmation bias). When shopping, the first price we see becomes our point of comparison, so then prices below become a deal while prices above are a rip-off (anchoring bias). Or many of us fear flying, because we recall news stories of rare plane crashes more easily than stories of common car crashes (availability bias).
Having a cognitive bias doesn’t make you a bad person or a bad doctor. It makes you human, says Dr. Pat Croskerry, a retired ER doctor and cognitive psychologist in Halifax, who has authored several journal articles on the topic.
“We’re all biased,” he states. “We have vulnerabilities towards particular ways of thinking.”
In medicine, acting on a cognitive bias doesn’t automatically mean you’re wrong. Odds are, the patient under the weather really does have the flu. Still, they could have meningitis, pneumonia or something else that requires more investigation.
When the stakes are high, the risk is that biases can lead you astray, like taking the wrong exit on a highway. Maybe you’ll get back to where you ought to be, or maybe you’ll end up lost on a whole other route.
“All biases can potentially have an impact on patient safety and patient care,” says Dr. Armand Aalamian, Executive Director, Learning at the Canadian Medical Protective Association (CMPA).
Studies have shown a link between cognitive biases and medical errors, from diagnoses to the operating room. As one UK study concluded, “It is likely that most, if not all, clinical decision-makers are at risk of error due to bias.”
For the sake of clinical effectiveness and patient well-being, it’s important to understand the nature of cognitive biases and how to mitigate them.
Mental shortcuts can be efficient – but aren’t always accurate
Essentially, cognitive biases are heuristics, which are mental processes used for quick decision-making. They help us manage information efficiently, if not always 100 percent accurately. “You have to be mindful that heuristics are shortcuts,” says Dr. Aalamian.
The CMPA, in a learning module called “Clinical Reasoning: The Impact of Bias”, notes that we use heuristics unconsciously, based on prior experience, pattern recognition, intuition and our perceptions. In complex environments that require frequent decisions, like in medicine, heuristics are invaluable.
The same sort of thinking that can be helpful can also be a stumbling block, points out Dr. Christopher Wallis, a urologic oncologist and an Assistant Professor, Division of Urology, Temerty Faculty of Medicine, University of Toronto. He notes that medical training centres on recognizing patterns, and that a cognitive bias can “simplify reality”.
The term bias carries baggage, but it’s just a cognitive train of thought, says Dr. Angela Jerath, a cardiac anesthesiologist and an Associate Professor, Anesthesia & Pain Medicine, Temerty Faculty of Medicine, University of Toronto. “It’s just how we function, and shouldn’t be put in a negative light.”
Still, the negative connotation is understandable. We often think of biases in terms of discrimination against certain groups, and that is one type. Given that connotation, it’s no wonder many people object to the idea that they hold cognitive biases of any sort.
“Biases have a social loading to them, because people think you’re talking about stereotypes. But a bias is a predilection, something that’s getting more of your attention than it deserves. It’s part of the architecture of the brain,” says Dr. David Hauser (Ph.D), Assistant Professor of Social/Personality Psychology at Queen's University.
“We spend 95 percent of our time in a heuristic mode,” adds Dr. Croskerry. “A lot of things are a given, and you can make assumptions correctly most of the time. Now and again, they’re not correct, and that’s when something is labelled as a bias. When we’re tired, overworked or cognitively overloaded, these biases tend to be more prevalent.”
They can become cognitive traps. Peel back where things go wrong, and there’s often a bias at the core. Dr. Croskerry once compiled a list of 50 for Dalhousie University’s medical training. The anchoring, availability and confirmation biases mentioned at the top were among them, but here are 10 more:
- Diagnosis momentum: Once diagnostic labels are attached to patients, they tend to become stickier, until the diagnosis is treated as definite and all other possibilities are excluded.
- Premature closure: The tendency to accept a diagnosis before it has been fully verified, which accounts for a high proportion of missed diagnoses.
- Unpacking principle: A failure to elicit all relevant information in establishing a differential diagnosis. That can happen if you limit patients in giving their history, or limit yourself in taking it, so that unexplored possibilities are discounted.
- Search satisficing: A tendency to call off a search once a plausible solution is found, which can lead to missing things like co-morbidities, other fractures, co-ingestants in poisoning, etc.
- Framing effect: How you see things may be influenced by how an issue is framed. A perception of risk to the patient may be swayed by the prognosis. Or, one drug or treatment option could be considered superior depending on how its possibilities are framed (e.g., 95 percent success rate vs. 5 percent failure rate).
- Fundamental attribution error: The tendency to be judgmental and blame patients for an illness rather than examine the circumstances that may be responsible.
- Sunk costs: The more clinicians invest in a diagnosis, the less likely they may be to release it and consider alternatives. Think of how financial investors might be reluctant to dump a sinking stock, as that will lock in a loss. In the case of a diagnostician, the precious investment could be time, mental energy and ego.
- Representativeness restraint: The representativeness heuristic drives the diagnostician towards looking for prototypical manifestations of disease, missing atypical variants.
- Aggregate bias: A belief that aggregated data, like those used to develop clinical practice guidelines, don’t apply to individual patients (especially your own). Presuming that your patients are atypical or exceptional may lead to errors of commission, e.g., ordering unnecessary tests.
- Overconfidence bias: A belief that we know more than we do. That can play out by placing too much faith in opinion rather than evidence, or acting on incomplete information.
There are a few hundred cognitive biases, which influence everyone in their personal and professional lives. Have any of the above biases ever crept into your work? If not, you may be under the influence of another major bias: blind spot bias. That’s the belief that you’re somehow less susceptible to bias than your peers.
Strategies to mitigate biases
Becoming more resistant to cognitive biases hinges on recognizing the potential for such thinking. It’s not always easy. Many people are reluctant to acknowledge their biases, and are often even unaware they hold a particular one (or any).
Dr. Croskerry, who analyzes medical decision-making, says certain dispositions can help you prevail against biases. If you’re naturally sceptical, he says, there’s less chance of biases breaking through.
But start by assuming that you, like every other human on the planet, have unconscious biases as a normal operating characteristic. “If you have a brain, you have a bias,” says Dr. Aalamian.
So learn about them, and how and when they can manifest in a medical practice. Then, says Dr. Hauser, you can better reflect on which biases might apply to particular situations, or when you’re more prone to biased thinking.
De-biasing yourself isn’t about eliminating cognitive biases from your brain, or even consciously noticing when they’re at play, says Dr. Croskerry. It’s a matter of knowing that you have these biases.
With that awareness, you can challenge your initial assumptions. So gather more information. Develop a differential diagnosis. Explore other possibilities. Consider the worst case scenario. Reconsider your reasoning.
“I try to teach trainees when to go fast and when to go slow,” says Dr. Wallis. He says cognitive biases are like an autopilot; you have to know when to turn it off.