Just a few years after countless organizations pledged to address systemic racism and build a more equitable workplace, diversity, equity, and inclusion (DEI) efforts are now facing significant backlash. Critics claim that DEI undermines merit, reduces quality, and, in extreme cases, even poses risks. Yet, evidence from the health care field suggests the reality is quite the opposite.
Racial Health Disparities
Racial health disparities in the United States remain both alarming and deeply entrenched. Black Americans experience shorter life expectancies, higher vulnerability to chronic conditions, and elevated rates of hypertension, heart disease, and other illnesses compared to other groups. At the same time, they face barriers to accessing care, and when care is available, they often encounter medical racism, receiving less thorough or attentive treatment than white patients.
Research also shows that repeated exposure to racial discrimination can worsen health outcomes, compounding the problem. As a result, Black Americans live within a system that not only imposes social pressures harmful to health but is also supported by providers who too often reinforce, rather than reduce, these inequities.
Given the life-and-death stakes of health care, it may seem logical to agree with critics who argue that DEI lowers standards. After all, wouldn't we want the clinicians diagnosing illnesses and guiding treatment to be the most capable people for the job? The evidence, however, tells a different story: diversity in health care doesn't weaken quality; it strengthens it.
Benefits of DEI for Health
Research consistently shows that increasing the representation of women across all races and Black men in medicine leads to better outcomes and helps reduce health disparities.
These groups contribute valuable perspectives, lived experiences, and cultural understanding that enrich medical practice. Such diversity is especially important in caring for a wide range of patients with different needs.
For example, greater representation of Black men in medicine has been linked to reductions in racial health disparities, including fewer cardiovascular deaths among Black male patients.
Similarly, women experiencing heart attacks have improved outcomes when treated by female physicians. The evidence is clear: diversity in health care delivers tangible, measurable benefits, strengthening the profession rather than weakening it.
Health Care Workers Benefit From DEI
Patients clearly benefit from greater diversity in health care, but the advantages extend to the workforce as well. Although Black men make up only about 2% of physicians, research shows they are often attuned to the gender biases within this male-dominated field and actively support their women colleagues. That support can take many forms: mentorship, advocacy, or calling out "old boys' club" dynamics that disadvantage women.
Gender diversity also improves patient care. Studies indicate that male doctors working in teams with more women physicians deliver better results as well, suggesting that the skills and approaches women doctors bring to patient care may positively influence their male counterparts.
Health care offers a powerful example of how diversity benefits both patients and practitioners. Far from diminishing quality, diversity strengthens it, and in medicine, it can mean the difference between life and death. As backlash against diversity efforts grows, it is worth asking how limiting representation may not only undermine progress in health care but also harm other industries where inclusion could be equally transformative.