
Implicit Bias in Health Care

Photo of four hospital beds in a large room

When implicit bias comes into play in the provision of health care, the consequences can be life-altering

Within minutes, two women arrived at the same emergency department with chest wounds. Yet, they experienced very different care.

With the first, recalls her physician, “I’m using her first name while talking to her, and ensuring that she’s getting pain meds.” The doctor also had a social worker give the patient frequent updates on the condition of her daughter, who was also injured.

Meanwhile, the second woman received much less attention. In fact, while she lay there in pain, the ED team spent time debating the logistics of how to admit her instead of focusing on her care.

“Nobody was being nice to her, not like we were to the other person,” the physician remembers.

Why? The physician suspects it was because the second patient was a transgender woman.

The physician recounted the experience during a roundtable discussion hosted by Ohio State University’s Kirwan Institute for the Study of Race and Ethnicity. He was clearly at a loss to explain why he and his team had provided such disparate care. After all, they were all good people who had gone into health care for all the right reasons.

But such is the insidious nature of implicit bias, says Dr. Javeed Sukhera, a child and adolescent psychiatrist in London, Ontario, who has researched implicit bias extensively among health-care professionals. Bias, he said, operates outside our awareness and despite our best intentions. Physicians are committed to treating all their patients equally, yet everyone makes nonconscious judgments that can affect how we respond to people who may seem different from us, he said.

Implicit bias has been defined as the attitudes or stereotypes that affect our understanding, actions, and decisions in an unconscious manner. These attitudes may even conflict with our declared beliefs and with how we see ourselves.

It is human to make unconscious associations — both positive and negative — about other people based on certain characteristics. Evolutionary psychologists believe these unconscious judgments developed as a survival mechanism: our early ancestors needed to make on-the-spot decisions about whether the person (or animal) they encountered was likely to be friend or foe. Millennia later, these ingrained tendencies remain a shortcut that our brains rely upon to keep us safe.

“Having a bias is not the same as being racist, sexist, ageist, or opposed to any individual. Biases are a normal part of the human brain’s ability to interpret complex data streams,” writes Dr. Michelle Van Ryn in Protecting Yourself and Your Patients from Implicit Biases, a white paper developed by Diversity Science, a public-benefit company that champions an evidence-based approach to equity and inclusion.

But while these automatic responses enable us to make faster decisions, they can also lead us to make unjustified conclusions.

“As human beings, we want to be logical, fair, and objective,” said Kelly Capatosto, who heads the Race and Cognition division at the Kirwan Institute, “but in reality, we all hold our own preconceived ideas about other people, whether it’s what we learned from our family, our early experiences, or media we consume. All of these sources can be ways we learn our biases over time. Humans can’t be ‘colorblind’ even if we wanted to be.”   

Many of the studies conducted over the last two decades have focused on racial bias, finding that while the conscious, explicit attitudes of white individuals towards people of colour have steadily improved, implicit attitudes have not. Researchers have also documented concerning findings related to people who may be unconsciously targeted because of gender, sexual orientation, gender identity, social class, or appearance.

And when implicit bias comes into play in the provision of health care, the consequences can be life-altering, said Dr. Sukhera. “Our implicit biases can perpetuate health-care disparities for some of our most vulnerable patients … We are making clinical decisions that have a real impact on the way in which care is provided for specific groups, particularly for Black and Indigenous Canadians. This research demonstrates how those biases actually cause significant harm despite our best intentions,” he said. 

Since the 1990s, hundreds of studies have documented health-care disparities affecting virtually every aspect of the medical experience — from evaluations of pain and diagnostic criteria to treatment plans and patient interactions.

Examples of implicit bias affecting patient care include:

  • Women are three times less likely to be referred for total knee replacement than men, even when clinically indicated. 
  • Black and Hispanic patients are significantly less likely than white patients to receive pain medications, and when they do receive analgesics, the dosages are lower than those given to white patients despite higher pain scores. 
  • Physicians are less likely to treat suicidal ideation in elderly patients than in younger patients, even though men aged 85 to 89 are at high risk of suicide. 
  • Women are less likely to be diagnosed with chronic obstructive pulmonary disease than men, despite having similar histories and medical examinations. 
  • Specialists in obesity implicitly associated obese people with negative cultural stereotypes, such as “stupid, lazy and worthless.”
  • Lesbian and bisexual women report significantly lower rates of timely pap testing than heterosexual women, with many lesbians stating they have never received a pap test. 

While an implicit bias may be invisible to its holder, one study suggests it is very evident to the person on the receiving end. Researchers Jeff Stone and Gordon Moskowitz found that while physicians’ attention is focused on controlling an explicit bias, their implicit attitudes and beliefs “can leak out through non-verbal behaviours, such as eye contact, speech errors and other subtle avoidance behaviours that convey dislike or unease in the presence of minority group patients.”

The study also found that the participating physicians’ sense of how well the interaction unfolded did not match that of either the patients or the study’s observers. Because the physicians were guided by their explicit responses only, they reported — erroneously — that the interactions had gone well. “This suggests that implicit racial biases can impact not only a medical professional’s behaviour toward a minority group patient, but also how the minority group patient feels about the interaction with the provider; however, the provider may never come to realise why the patient failed to return or to take the medical advice they were given,” wrote Stone and Moskowitz.

“We are much better at managing the instinctive effects of our biases if we take care of ourselves.”

The typical clinical encounter, with its time constraints and frequent need for snap judgments, creates exactly the conditions that maximize implicit bias.

Dr. Sukhera says medical training may partially explain some of the decisions physicians make. “We learn by using biases and heuristics to identify patterns,” he says. “It is part of how we make clinical decisions. So that’s an important aspect of recognizing and managing biases because how we teach about things reflects how we make clinical decisions, which may also reflect problems for certain groups.”  

Dr. Sukhera drew on an example from his own practice. “I work with young people with mental illness. Although 90-plus percent of people with eating disorders aren’t young boys, it’s still very important for me to consider the possibility of an eating disorder when assessing a young man and not let my bias that this is primarily something that affects young women get in the way. Otherwise, I’ll miss that diagnosis in that group,” said Dr. Sukhera. 

Because implicit bias is unconscious, it can be hard to capture and confront. Fortunately, researchers have developed tools to bring our hidden biases to our awareness. 

The most well-known test, Harvard’s Implicit Association Test (IAT), measures attitudes and beliefs that people may not know they hold. Using timed-response exercises, the IAT gauges the strength of a person’s subconscious association between mental representations of objects (concepts) in memory. In recent years, the test has been criticized, with the caution that its results may not be as definitive as they appear. However, at the very least, say experts, it has value as a directional pointer for exploring one’s blind spots.
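For readers curious about the mechanics, here is a rough, purely illustrative sketch in Python — not the official IAT scoring algorithm, and the reaction times are invented — of how a timed-response measure can be reduced to a single association score: responses tend to be faster when a pairing matches an implicit association than when it conflicts with one, and the gap between the two, scaled by overall variability, yields the score.

    # Illustrative only: a simplified IAT-style association score from
    # hypothetical reaction times (milliseconds). Real IAT scoring uses
    # multiple blocks and additional data-cleaning rules.
    from statistics import mean, stdev

    compatible_ms = [620, 580, 640, 600, 590, 610]      # pairing matches the association
    incompatible_ms = [710, 690, 760, 720, 700, 730]    # pairing conflicts with it

    pooled_sd = stdev(compatible_ms + incompatible_ms)  # pooled variability of all latencies
    d_score = (mean(incompatible_ms) - mean(compatible_ms)) / pooled_sd

    print(f"Illustrative association score: {d_score:.2f}")  # larger = stronger implicit association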

It can be hard to realize — and acknowledge — that we don’t think like we think we think. When the health-care professionals who participated in Dr. Sukhera’s studies became aware of their biases through the IAT, it led to significant upset and defensiveness.

“[The results] conflicted with an idealized version of ourselves, as members who belong to a profession where you are not allowed to be biased or non-objective,” he says. “And I think that’s really critical learning from our research because it highlights first of all, how difficult it is to reconcile the idea that we are flawed, and it also highlights that when we’re teaching and learning about things like racial bias, we are doing it in a culture that is trying to push people to become impossible fantasy versions of who they really are.”

If we want to address the harmful impact of bias, said Dr. Sukhera, we must normalize bias. “Having bias is not something that should make us feel ashamed or guilty. The very first step in the journey to awareness is having the humility to accept our shortcomings and our vulnerabilities and, in fact, to embrace those imperfections,” he said. 

In his training of physicians who want to change their behaviour, Dr. Sukhera’s first priority is to create a safe environment that allows people to lean in to their humanity. “We are not androids,” he says. “We are always going to have some bias, but there are ways to mitigate the negative impact.” 

He has found it useful, for example, for participants to engage in perspective-taking exercises “to allow them to really dig deep and imagine what this experience looks like from another perspective.”

Ms. Capatosto says physicians who have identified a particular bias within themselves should actively seek out experiences and relationships with targeted individuals that will prove those stereotypes false. “If we do that, we can build empathy for others, and have a more honest and reflective view of our own values,” she said.

Significantly, Dr. Sukhera urges physicians, first and foremost, to be compassionate with themselves.  “The instinct in people who work in health professions is to do more and to do extra, which ends up depleting people’s reservoirs of kindness. We are much better at managing the instinctive effects of our biases if we take care of ourselves.”

Ms. Capatosto says the path forward lies in the will to reflect and then amend our behaviours accordingly. “People have the capacity to change,” she says. “People have the capacity to learn. People have the capacity to grow. All of these are true when we think of the slow and unpredictable journey that any one individual takes to get to the place of awareness.”

Quick Tips

Be aware of the existence of unconscious bias and that it may be influencing your behaviour towards patients and learners.

Reflect on the information and messages you receive throughout the day (media, news, advertisements, etc.) and consider how they might be unconsciously influencing your views of particular groups or individuals.

Explore the concepts of thinking fast and slow. Thinking fast is unconscious and leads quickly to an answer, while thinking slow takes deliberation.

References

Proceedings of the Diversity and Inclusion Innovation Forum: Unconscious Bias in Academic Medicine: How the Prejudices We Don’t Know We Have Affect Medical Education, Medical Careers, and Patient Health. Association of American Medical Colleges and the Kirwan Institute for the Study of Race and Ethnicity at The Ohio State University.

Banaji MR, Greenwald AG. Blindspot: Hidden Biases of Good People. 1st ed. Delacorte Press; 2013.

Gaertner S, Dovidio JF. The aversive form of racism. In: Dovidio JF, Gaertner S, editors. Prejudice, discrimination, and racism. Orlando: Academic Press; 1986. pp. 61–89.

Chapman EN, Kaatz A, Carnes M. Physicians and implicit bias: how doctors may unwittingly perpetuate health care disparities. J Gen Intern Med 2013;28:1504-10.

Schwartz MB, Chambliss HO, Brownell KD, et al. Weight bias among health professionals specializing in obesity. Obes Res 2003;11:1033-9.

Valanis BG, Bowen DJ, Bassford T, Whitlock E, Charney P, Carter RA. Sexual orientation and health: comparisons in the women’s health initiative sample. Arch Fam Med 2000;9:843-53.

Mossey JM. Defining racial and ethnic disparities in pain management. Clin Orthop Relat Res 2011;469:1859-70.

Todd KH, Samaroo N, Hoffman JR. Ethnicity as a risk factor for inadequate emergency department analgesia. JAMA 1993;269:1537-9.

Uncapher H, Arean PA. Physicians are less willing to treat suicidal ideation in older patients. J Am Geriatr Soc 2000;48:188-92.

Statistics Canada 2019

Stone J, Moskowitz GB. Non-conscious bias in medical decision making: what can be done to reduce it? Med Educ 2011;45:768-76.

Ross HJ. Everyday Bias: Identifying and Navigating Unconscious Judgments in Our Daily Lives. Rowman & Littlefield; 2020.