Published: May 4, 2018

Cognitive and implicit bias as barriers to optimal patient management


From the AAO-HNSF PSQI Committee

Karthik Balakrishnan, MD, MPH; Emily F. Boss, MD, MPH; C.W. David Chang, MD


Many factors influence how we collect, integrate, and interpret information to make clinical decisions. Some of these factors, such as our training, experience, and education, affect our conscious thought processes. Others work more subtly to push our thought processes in directions that may not always be in patients’ best interests. These latter cognitive processes are called unconscious biases, and they can have a profound effect on our medical decisions and the quality of care we provide. They are present in every clinician’s mind, and they are in play in every clinical interaction and decision encountered by that mind.

While cognition is an incredibly complex realm, one useful framework for understanding it was proposed by Daniel Kahneman and Amos Tversky several decades ago (Tversky and Kahneman, 1974). Briefly, they suggested that the mind has two main pathways for information processing. Process 1 is a rapid, unconscious pathway that categorizes, weighs, and interprets new information through a variety of shortcuts called heuristics. For example, the availability heuristic compares new situations to examples easily recalled from memory in order to decide how to respond. Process 2 involves conscious and deliberate study and situational analysis, carefully weighing evidence, possible explanations, and potential outcomes.

Process 1 can be used very effectively, particularly by experienced master clinicians who seem to reach correct diagnoses and treatment plans in an instant and often cannot explain how they did so. On the other hand, Process 1 heuristics may be applied inappropriately by inexperienced clinicians, leading to errors in diagnosis or treatment. For example, an on-call junior resident may be deciding whether a patient has invasive fungal sinusitis. If the resident has seen that disease only once before, a quick comparison of the new patient to that single remembered case may not yield the correct diagnosis, especially if the cases do not match closely. Why do these two clinicians differ? Because of greater experiential history, the master clinician is often better able to distinguish relevant information from “red herring” information that can be disregarded. Process 1 tends to be applied most often under pressure, time constraint, or distraction. Compounded by the ever-present danger of making decisions on incomplete information, inappropriate Process 1 thinking leaves physicians, even experienced ones, vulnerable to error.

The junior resident should probably be using Process 2 thinking. Rather than relying on a single prior case, the resident would do better to review the imaging, examine the patient, consult textbooks and journal articles, and weigh alternative diagnoses before settling on a final diagnosis and treatment plan. This deliberate, conscious pathway is much more time- and energy-intensive than Process 1, making it less convenient in high-pressure or time-limited situations. The mind often falls back on Process 2 when no closely matching situation is available for comparison via Process 1.

Cognitive and implicit biases occur when the mind applies Process 1 heuristics inappropriately, leading to misinterpretation, misintegration, or misapplication of data. Cognitive biases generally affect how we use clinical data, while implicit biases color how we use that data through the lens of an individual patient’s personal characteristics, such as age, race, gender, and socioeconomic status. It is important to note that implicit biases are present in all individuals, even those who do not exhibit overt discrimination; rather, they are a product of the society in which we grow up. These implicit biases affect not only how we perceive patients but also how we perceive colleagues. Both types of bias, cognitive and implicit, reflect the mind’s inappropriate application of shortcuts to newly available information, leading to assumptions and conclusions that do not truly fit the data at hand.

These biases have clear effects on the medical and surgical care we provide. A recent study of 69 surgical “never events” identified 628 contributing human factors; nearly 20 percent were attributed to cognitive errors, with confirmation bias the most common (Thiels et al., 2015). Meanwhile, implicit biases have been shown to predict important differences in management between patient groups. A well-designed 2007 study found that clinicians with higher levels of implicit racial bias were less likely to refer black patients with acute coronary syndrome for thrombolysis, even after controlling for physician race, self-perceived bias, and belief in treatment effectiveness (Green et al., 2007).

If these biases are present in all of us and color every clinical decision we make, how do we reduce their effects in order to provide the most equitable, high-quality care possible to all our patients? The biggest steps are to become more aware of and acknowledge our biases and to engage Process 2 cognition more intentionally. This was convincingly demonstrated in a secondary analysis performed by Green et al. When clinicians were made aware of their implicit racial biases, the pattern of recommendation for thrombolysis reversed, with higher clinician bias predicting a greater likelihood of referring black patients for thrombolysis (Green et al., 2007).

Increased awareness of cognitive biases, along with second-guessing oneself about how information is being processed, appears to reduce the effects of these biases. That awareness can be cultivated in multiple ways. Common “debiasing” techniques include:

  • seeking honest second opinions
  • assigning a “devil’s advocate” for important decisions
  • framing questions correctly so the clinician doesn’t get the “right answer to the wrong question” or miss the bigger picture for the patient
  • “post-settlement negotiation,” in which the clinician or team performs self-review or peer review of key decisions. (Statements provided courtesy of Ellis Arjmand, MD, MMM, PhD.)

Most strategies to combat implicit bias center on increasing empathy and avoiding burnout. These strategies shift the mind away from rapid Process 1 conclusions about people and toward a more careful Process 2 understanding of them as individuals who come to us with their own unique motivations and circumstances. Approaches include:

  • deliberately taking others’ perspectives
  • pausing before assigning motives to patients or caregivers
  • exposure to groups different from one’s own
  • awareness of the direction and magnitude of one’s own implicit biases, which can be assessed using freely available online testing (for example, at projectimplicit.org).

References

Green AR, Carney DR, Pallin DJ, Ngo LH, Raymond KL, Iezzoni LI, Banaji MR. Implicit bias among physicians and its prediction of thrombolysis decisions for black and white patients. J Gen Intern Med. 2007 Sep;22(9):1231-8. Epub 2007 Jun 27.

Thiels CA, Lal TM, Nienow JM, Pasupathy KS, Blocker RC, Aho JM, Morgenthaler TI, Cima RR, Hallbeck S, Bingener J. Surgical never events and contributing human factors. Surgery. 2015 Aug;158(2):515-21. doi: 10.1016/j.surg.2015.03.053. Epub 2015 May 29.

Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science. 1974 Sep 27;185(4157):1124-1131.

