
Recognizing and Limiting the Impact of Unconscious Bias


In this post, we continue our exploration of bias, looking specifically at how recognizing bias and building awareness of it is the first step toward mitigating its impact.


If you haven’t yet read last week’s post or need a refresher, then be sure to read it first here.

Last week’s look at Our Biased Brains highlighted how bias forms and how that process is driven by our unconscious brain in a way that we don’t realize is occurring. It was unnerving for me to learn more about this process, and I’m guessing that’s true for many of you. 

After all, in a way, bias can be viewed as robbing us of our own free thought and will—of our autonomy—if our perceptions and reality as we know them are shaped so strongly by passive inputs. Remember, we’re talking over 10 million inputs per second! 

This is why it’s critical to understand bias and where it comes from. We must understand that it’s normal and inevitable. To have biased thoughts is to have a normally functioning human brain, troubling as that may sound. 

In order to make inroads with handling our bias better, we need to continue learning more about it.

The Fabled Five

Recognizing bias is one of our best tools for combating its impacts. Categorizing bias facilitates this endeavor by making it easier to identify.

There are numerous categories, but five encompass the majority of bias in our lives. Each of these is an unconscious filter through which we process the world: 

  1. Cultural Values and Beliefs
  2. Affinity
  3. Difference
  4. Confirmation
  5. Primacy

1. Cultural Value and Belief Bias

This bias arises from the fact that we link certain behaviors and beliefs to shared cultural values. In another culture, the exact opposite could be true.

For example, interrupting someone in conversation can be viewed differently based on values. One person may feel culturally compelled to interrupt someone who has stated something incorrectly or when the moment calls for added context only they can provide.

To another person, interrupting may be anathema to their beliefs, so they avoid doing so. This person may even view an interrupter as having a moral failing.

Another example comes from my recent viewing of the remake of Dune.

In the movie, there is a scene where Leto Atreides meets the leader of the local population, the Fremen, on the planet Arrakis. Upon introductions, the Fremen leader spits in front of Leto—seemingly at him—and his guards draw their weapons at this perceived slight. 

However, someone familiar with Fremen ways explains that this is a great honor to “waste” saliva for a guest (since they are on a desert planet where water is scarce). Leto returns the gesture as is custom, and the meeting proceeds. 

He looks overjoyed to have such an honor bestowed upon him.

2. Affinity Bias

We show more affinity to those who are similar to us. We are more comfortable around and forgiving towards them.

This impacts whom we associate with, befriend, and promote, among other things.

Last week, I gave the example of Ivy Leaguers only wanting to promote fellow Ivy Leaguers within an organization. Affinity bias meant supporting those similar to the leaders at the expense of those who were not, even if they were otherwise equally qualified.

If you look at rare disease funding and research at the federal level, diseases that primarily affect white people garner more academic attention and dollars. Federal funding per person with cystic fibrosis (a condition primarily affecting white people) is $2,800, about 3.5x the funding per person with sickle cell disease (a condition primarily afflicting people of color), which is $800. This disparity is further highlighted by their relative prevalence: around 30,000 people live with cystic fibrosis while about 100,000 live with sickle cell disease.

This pattern holds true for foundation funding for these diseases. We know that most wealth in the US is disproportionately concentrated in the hands of white Americans compared to Americans of color. When donating money towards diseases, Americans more frequently donate to causes that impact others who are similar to themselves. 

3. Difference Bias

This bias runs in contradistinction to affinity bias. We identify those dissimilar to us as “other” and seek to confirm our views about them while dismissing anything contradictory to our views. 

In school, black students are 3x more likely to be disciplined for misconduct compared to white students. Note, they are not transgressing 3x more than the white students, but are punished at a higher rate. They are viewed as more problematic and punished as the “other.” 

Patients of color are 22% less likely to receive pain medications for similarly painful presentations. They may be more likely viewed as “drug-seeking.”

In Minneapolis, we have data that black patients presenting with sickle cell crises (considered at least as painful as a broken bone) are less likely to be given opioids for their fracture-equivalent pain. Again, the “other” is treated differently. 

Another example: many patients don’t want physicians with foreign-sounding names because these patients assume that such physicians are non-native English speakers and will be difficult to understand. 

I’ve been told by patients that they can barely understand my accent.

Y’all, I grew up in Kansas. Like, from-where-Superman-grew-up America (hmm…maybe a bad example since he’s an alien). My accent is undoubtedly American.

I can only imagine those patients’ brains were suppressing the clearly understandable American accent in which I was speaking so that they could still believe I had a “foreign” accent based solely on my “foreign-sounding” name.

4. Confirmation Bias

We seek information that confirms our currently held beliefs.  Our brains elevate and promote this information while overlooking or suppressing contradictory information subconsciously. 

As you can tell from many of the examples I’ve given already, there is a lot of overlap between confirmation bias and the other subconscious filters.

Here’s an age-old one that most of you have witnessed, if not experienced: patients believing any woman caring for them must be a nurse and not their physician. 

This belief is part active experience—most of their nursing team is likely to be female—and part passive experience—e.g., from depictions on TV. And every time they have female nurses and male physicians, the stereotypes gain more subconscious traction. 

What comes to mind when you think of the CEO of a company? Chances are you pictured a white male who is at least six feet tall. The white male part probably didn’t surprise you, but if you’re like me, the height association certainly did!

Sixty percent of CEOs are greater than six feet tall despite less than 15% of the population being that tall.  Apparently, height is associated with success and leadership—i.e., CEO material.

Who’s more CEO-like?

Confirmation bias can also tie into other biases like the next one.

5. Primacy Bias

I mentioned this one in the last post when I referenced resumés. 

We are more likely to recall information presented early on and more likely to forget information presented later.

This relates closely with recency bias (more likely to give greater importance to a recent event over a remote event) and anchoring bias (over-emphasizing or relying on the first piece of data we receive). 

Primacy and anchoring can impact the formation of differential diagnoses. If the patient we accept from the emergency department comes with a provisional diagnosis, we are more likely to trust that diagnosis since it’s what we heard first.

Critical Systems Failure

All the above categories and examples relate to bias at the individual level. What happens when you take a population-level look at bias?

Systematic bias!  Yes, similar people share similar biases, and shared patterns of unconscious judgment create unique and comprehensive advantages for certain dominant groups.

These show up in how we shape our societies and the institutions on which we rely. 

Don’t think this exists? Just look at our education and criminal justice systems here in the US. Systematic bias in both leads to the racial disparities we see—achievement gaps for students of color, the disproportionate incarceration of people of color, the disproportionate use of deadly force against people of color. 

These systems and institutions are designed to produce these outcomes regardless of the intentions of those who work within them.

Said differently, even non-racist people working in the educational or criminal justice systems inadvertently contribute to disparate outcomes because of the innate biases upon which the systems were built. The simple functioning of the institution propagates the disparity.

The bedrock of our criminal justice system was established during the undeniably racist Jim Crow era, so it shouldn’t come as a surprise that this system disadvantages (read “incarcerates and kills”) people of color.

The Power of Bias

Therein lies the power of bias: unconsciously chosen data may inform inaccurate judgments, those judgments can be amplified across groups into shared judgments that benefit one group at the expense of others, and those shared judgments come to underpin everything about our modern world!

Phew! That was a lot to take in!

A lot of judgments come from fear, from a place of self-preservation. Remember, biases are normal and have their evolutionary uses, but they also have deleterious effects in the society we’ve (or, rather, some segment of biased folks have) shaped.

Be Conscious About the Unconscious

We can’t help but be biased. And while the negative impacts of bias are real and pervasive, we need to be neutral about bias itself.

I’m not saying to be neutral about the effects of bias—e.g., the examples I’ve given thus far. But normalizing the experience of being biased, in my armchair expert opinion, is the first step in dismantling its impacts and manifestations. 

Based on the science underlying bias (see last week’s post), normalization is justified and can help foster discussion by preventing the immediate defensiveness that more often than not accompanies accusations of being biased.

How then do we bring that which is unconscious to the forefront to be addressed? With awareness.

Awareness allows us to intentionally make better, more conscious decisions based on higher fidelity data.

Another feature of being human is our ability to think about what we’re thinking when we think it (so meta!). 

So we can train ourselves to pause when a biased thought arises, recognize it for what it is, and actively choose whether that thought serves our goals.

How can we become more aware? We can use tools to show us our blind spots. 

You Down with IAT?

“Yeah, you don’t know me,”  says most everyone who takes one of Harvard’s Implicit Association Tests (IATs), one set of such tools.

That’s because these IATs force us to look at whether we have unconscious associations that match stereotypes about groups of people. The evidence-based results may be rife with unpleasant, hard-to-swallow truths about ourselves.

Sounds like fun, right? Check it out if you’re willing. Just don’t use the results to beat yourself up.

Their data is quite robust and illuminating. For instance, they looked at how testers associate societal role (career or family) with each gender (their term).

Over the span of a decade, the responses from one million testers demonstrated that 75% of them associated women with family and men with career. Only 17% of testers showed no preference one way or the other. This was true for both men and women testers.

Strategies Against Bias

After awareness, we can take action against bias. 

1. Ensure Diversity. A plethora of voices tends to drown out the cognitive shortcuts that lead to stereotypes. However, this requires a critical mass of diverse voices, not just token representation. Diversity does not mean diversity along racial or ethnic lines alone, but also along sex, socio-economic, and other lines.

2. Check Systematic Bias. As demonstrated above, assumptions inherent in our systems can impact what we deem as important. Therefore, we should review systems and processes with a bias for uncovering biases (see what I did there?).

Assume unconscious biases are always at play and see if you can uncover them in order to address them.

3. Dedicate Time. Our first attempt or pass at something is often unfiltered and riddled with bias. Take time, as mentioned in #2 above, to revise. Scheduling time to do this demonstrates a willingness to devote time (and attention and focus) to it. Make it part of the process.

4. Standardize Interaction & Input Norms. Give time and space to elevate all voices and encourage dissenting opinions that may help uncover and call out biases recognized by only a few.

Thinking Our Way Out of Bias

As you can see, there are mechanisms to mitigate the impact of bias. One important facet of those strategies is essentially to generate more active inputs of higher-fidelity (or, perhaps, simply more nuanced) data on which we can base our thoughts.

This takes awareness and intentionality—using thought tactics to limit thought errors. 

Bias can seem daunting to address, and it is. But it can be done, even if slowly and piecemeal.

It’s worth the effort. And it starts with us.

What steps can you take to mitigate the impact of bias? Let me know in the comments!

If you haven’t subscribed to my email list, then do so below so you don’t miss my new posts or my weekly updates (only for subscribers).

I’d also be most appreciative if you shared this post with anyone whom you think would benefit from the content or message of the blog. They may similarly be most appreciative 😀.
