When Code Meets Culture
Sai Lavanya Patnala, M.B.B.S.

Keywords: cultural competence, AI in healthcare, sensitivity, bias
Cultural sensitivity in medical practice has long been a challenge, even before the rise of Artificial Intelligence (AI). Medicine is not one-size-fits-all, and this is especially true in the context of ethnic diversity. While progress is being made through growing awareness and the introduction of modern solutions, there is still a significant journey ahead.
Culture influences communication beyond language. Perception, thinking, and acting are guided by cultural affiliation. Even the way a patient expresses a concern can vary widely, not only across cultures but also among different groups within a culture. Misinterpretations often arise from differing assumptions, values, and behavioral norms, leading to potential gaps in care. Researchers from the Netherlands identified key predictors of culture-related communication problems, including cultural differences in explanatory models of health and illness, differences in cultural values, cultural differences in patients’ preferences for doctor–patient relationships, racism/perceptual biases, and linguistic barriers1.
Cultural Bias in Healthcare

Different cultural groups express and cope with mental health in distinct ways. Asian patients may somatize emotional distress, expressing deeper issues only with probing, while African American individuals often turn to spirituality as a coping strategy2. In dermatology, conditions like melasma, postinflammatory hyperpigmentation, keloids, and alopecia present differently in skin of color, which can delay diagnosis and increase the risk of underscreening3.
Such disparities and biases are not only prevalent in clinical environments but also carry over into language and machine learning models. These biases often stem from the over- or under-representation of certain populations in training datasets, reflecting historical influences rooted in racism, sexism, and other socioeconomic biases. When such data form the basis for training machine learning (ML) algorithms, those algorithms risk perpetuating the same biases4.
Identifying the Gaps in Artificial Intelligence
AI solutions have proven beneficial for patients in clinical oncology, dermatology, prediction of postpartum depression, and nutrition counselling, and are emerging in preventive care and medical robotics4. ML algorithms excel at recognizing patterns and anomalies in imaging modalities such as X-ray, MRI, and CT, supporting timely diagnoses. For instance, deep learning models have shown exceptional accuracy in detecting diabetic retinopathy, enabling early intervention, especially in regions with limited access to ophthalmologists5.
However, many AI models are trained on datasets drawn from predominantly White cohorts that either exclude ethno-racial information or, when it is included, have historically incorporated it incorrectly. For instance, race-based adjustments in pulmonary function tests and pain scores are still widely applied, contributing to poorer health outcomes for Black, Indigenous, and People of Colour4.
Studies have also revealed gender-based disparities: many AI models are trained predominantly on male data, with some reporting an error rate of up to 47.3% in identifying heart disease in women compared with just 3.9% in men. Similarly, certain dermatological AI models accurately diagnose skin conditions in light-skinned individuals but perform poorly for those with darker skin, with error rates as high as 12.3%5. Racial bias has also been observed in algorithms used to assess kidney function, which are essential in managing chronic kidney disease6.
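Disparities like these are typically surfaced by a subgroup audit: instead of reporting a single overall error rate, a model's misclassifications are broken down by demographic group. A minimal sketch in Python, using purely illustrative labels and toy data:

```python
from collections import defaultdict

def error_rates_by_group(y_true, y_pred, groups):
    """Compute the misclassification rate separately for each
    demographic group (e.g. sex or skin-tone category)."""
    errors = defaultdict(int)
    totals = defaultdict(int)
    for truth, pred, group in zip(y_true, y_pred, groups):
        totals[group] += 1
        if truth != pred:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Toy illustration: a model whose errors fall entirely on one group.
y_true = [1, 0, 1, 1, 0, 1, 0, 1]
y_pred = [1, 0, 1, 0, 1, 0, 0, 1]
groups = ["male", "male", "male", "female", "female", "female", "male", "male"]
rates = error_rates_by_group(y_true, y_pred, groups)
# rates == {"male": 0.0, "female": 1.0}
```

A gap between groups in such an audit, like the heart-disease error rates cited above, is a signal that the training data or model design needs revisiting.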
Moving Forward: Strategies for Change
The goal of culturally competent care is to provide consistent quality of care to every patient, regardless of cultural, ethnic, racial, or religious background. This requires awareness, attitude, knowledge, and skills, collectively called cultural competence, which, at its core, is patient-centered and emphasizes respect, sensitivity, trust, curiosity, and tolerance7,8.
Expecting an AI model to replicate this level of nuance is a tall order. Without careful attention to cultural norms and communication styles, AI risks perpetuating existing biases and widening disparities. Implemented thoughtfully, however, embedding cultural awareness into AI design not only makes these tools more accurate and relevant but also supports clinicians in bridging cultural gaps and delivering more inclusive, equitable care.
AI and ML algorithms must be trained inclusively to address biases. They should be designed and utilized in a manner that does not create or maintain health disparities currently experienced by vulnerable groups, and should address and remove existing health disparities4. Diagnostic algorithms should account for variations in disease prevalence, symptom presentation, and genetic factors among different ethnicities. To reduce bias, AI developers must focus on culturally tailored diagnostic models and incorporate datasets representing various ethnic, cultural, and socioeconomic backgrounds5.
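One simple, admittedly crude, way to counter under-representation in a training set is to oversample each demographic group up to the size of the largest one. A sketch under the assumption that records are dictionaries carrying a group label; real pipelines would favour principled reweighting or, better, collecting genuinely representative data:

```python
import random

def balance_by_group(records, group_key, seed=0):
    """Oversample each demographic group to the size of the largest
    group, so no group is swamped during training. A crude baseline,
    not a substitute for representative data collection."""
    rng = random.Random(seed)
    by_group = {}
    for record in records:
        by_group.setdefault(record[group_key], []).append(record)
    target = max(len(members) for members in by_group.values())
    balanced = []
    for members in by_group.values():
        balanced.extend(members)
        # Resample with replacement to make up the shortfall.
        balanced.extend(rng.choices(members, k=target - len(members)))
    return balanced

# Toy cohort: group "x" outnumbers group "y" three to one.
cohort = [{"group": "x"}] * 3 + [{"group": "y"}]
balanced = balance_by_group(cohort, "group")
# Both groups now contribute 3 records each.
```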
Effective user engagement is key to the adoption of AI and digital health solutions. Multilingual support ensures access for speakers of non-dominant languages. Content and recommendations, such as dietary guidance or health education materials, can be tailored to cultural preferences and health beliefs to improve relevance. Collaborating with cultural advisors and drawing on their insights can guide content and engagement strategies5.

It is important to create system-level changes, such as a federal or provincial regulatory framework, to ensure equity in the implementation of AI solutions. From the development stage—monitoring the inclusion of ethnoracial, sex, and gender characteristics—to continuous auditing to identify and correct emerging biases, algorithmovigilance must be embedded in AI projects. Existing models must be debiased by retraining without race variables to promote fairness through unawareness4,5.
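The "fairness through unawareness" step, removing protected attributes such as race before retraining, amounts to a simple preprocessing pass. A minimal sketch with hypothetical field names; note that dropping the attribute alone is not sufficient, since proxy variables can still leak the same information, which is why the continuous auditing described above remains necessary:

```python
def drop_protected_features(records, protected):
    """Return copies of training records with protected attributes
    (e.g. a race variable) removed, so a retrained model cannot
    condition on them directly."""
    return [
        {key: value for key, value in record.items() if key not in protected}
        for record in records
    ]

# Hypothetical training records; field names are illustrative only.
train = [
    {"creatinine": 1.1, "age": 54, "race": "A"},
    {"creatinine": 0.9, "age": 61, "race": "B"},
]
unaware = drop_protected_features(train, {"race", "ethnicity"})
# Each record now contains only "creatinine" and "age".
```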
Providing culturally sensitive care is already demanding; linguistic, cultural, and social differences often complicate doctor–patient communication8. Existing assumptions and unconscious biases among healthcare providers remain persistent barriers to equitable care. Understanding cultural diversity is therefore central to ethical and effective care, requiring open-mindedness, self-reflection, and continuous learning.
As we enter the era of Artificial Intelligence, it is vital to recognize that healthcare is not culturally neutral. In this evolving landscape, AI presents both significant opportunities and complex challenges. Culturally uninformed algorithms can amplify existing disparities. However, when guided by clinical and cultural expertise, AI can bridge communication gaps, improve access, and support more equitable, culturally competent care. Therefore, to truly harness the potential of AI in healthcare, active involvement of industry experts in development and implementation is not only important but is the need of the hour.
References:
1. Schouten BC, Meeuwesen L. Cultural differences in medical communication: a review of the literature. Patient Educ Couns. 2006 Dec;64(1-3):21-34. doi: 10.1016/j.pec.2005.11.014. Epub 2006 Jan 20. PMID: 16427760.
2. Office of the Surgeon General (US); Center for Mental Health Services (US); National Institute of Mental Health (US). Mental Health: Culture, Race, and Ethnicity: A Supplement to Mental Health: A Report of the Surgeon General. Rockville (MD): Substance Abuse and Mental Health Services Administration (US); 2001 Aug. Chapter 2, Culture Counts: The Influence of Culture and Society on Mental Health. Available from: https://www.ncbi.nlm.nih.gov/books/NBK44249/
3. Taylor SC. Epidemiology of skin diseases in ethnic populations. Dermatol Clin. 2003 Oct;21(4):601-7. doi: 10.1016/s0733-8635(03)00075-5. PMID: 14717401.
4. Gurevich E, Hassan BE, Morr CE. Equity within AI systems: what can health leaders expect? Healthcare Management Forum. 2022;36(2):119. doi: 10.1177/08404704221125368.
5. Parag N, Govender R, Ally SB. Promoting cultural inclusivity in healthcare artificial intelligence: a framework for ensuring diversity. Health Management, Policy and Innovation (www.HMPI.org). 2023;8(3).
6. Marko JGO, Neagu CD, Anand PB. Examining inclusivity: the use of AI and diverse populations in health and social care: a systematic review. BMC Med Inform Decis Mak. 2025;25:57. doi: 10.1186/s12911-025-02884-1.
7. Bobel MC, Hinai AA, Roslani AC. Cultural sensitivity and ethical considerations. Clin Colon Rectal Surg. 2022;35(5):371. doi: 10.1055/s-0042-1746186.
8. Taylan C, Weber LT. "Don't let me be misunderstood": communication with patients from a different cultural background. Pediatr Nephrol. 2023 Mar;38(3):643-649. doi: 10.1007/s00467-022-05573-7. Epub 2022 Aug 5. PMID: 35930048; PMCID: PMC9842546.