Artifex.News

How AI Can Fight Sex And Gender Bias In Healthcare

Posted on October 25, 2024 By admin



From assisting doctors with diagnoses to suggesting advanced treatments, Artificial Intelligence (AI) is transforming health and medicine. But AI has predominantly been developed by men, based on data sets that prioritise men’s bodies and health needs. That means many AI models are riddled with gender and sex biases – posing a health risk to women, as well as nonbinary patients.

With these biases in medicine coming under the spotlight in recent years, will AI widen existing healthcare inequities – or can it be harnessed to help bridge the gap?

Biased data

The calibre of an AI system depends entirely on the quality of the large data sets fed into the machine learning algorithms that underpin it.

If data either excludes or under-represents relevant sectors of the global population, ill-informed AI can pose serious health risks – from missed diagnoses, to compromising the interpretation of medical imaging, to incorrect intervention recommendations.
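The under-representation risk described above can be made concrete. As an illustrative sketch (not part of the original reporting; all names and numbers here are invented), a simple audit step can flag skewed group representation in a training set before any model is fitted:

```python
from collections import Counter

def audit_representation(records, attribute="sex", tolerance=0.2):
    """Flag any group whose share of the data set deviates from an
    equal share by more than `tolerance` (as a fraction of that share)."""
    counts = Counter(r[attribute] for r in records)
    total = sum(counts.values())
    equal_share = 1 / len(counts)
    flags = {
        group: abs(n / total - equal_share) > tolerance * equal_share
        for group, n in counts.items()
    }
    return counts, flags

# Hypothetical cardiology data set skewed 4:1 toward male patients.
records = [{"sex": "M"}] * 800 + [{"sex": "F"}] * 200
counts, flags = audit_representation(records)
# Both groups deviate from the 50/50 parity target, so both are flagged;
# the counts show female patients are the under-represented group.
```

A real pipeline would audit against population prevalence rather than strict parity, but the principle is the same: measure representation before training, not after deployment.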

Problems start with gender biases underlying the very coding of the AI software language.

Masculine stereotypes have infiltrated AI – from the apparently unconscious default to the male pronoun “he” when options are ambiguous, to alarming healthcare applications that threaten diagnosis and treatment.

For example, in the field of psychiatry, when men describe trauma symptoms, they are more likely to be diagnosed with post-traumatic stress disorder (PTSD), while women describing the same symptoms are at higher risk of receiving a personality disorder diagnosis.

This kind of gender bias can (and often does) influence a woman’s access to health care or her management within the healthcare system – and it appears this bias is replicated in AI models.

A 2020 study from the US found that Natural Language Processing AI models used in psychiatry demonstrate significant gender biases.

The research paper, published in PLOS ONE, warned that AI models that screen for psychopathology or suicide risk will make mistakes if they are trained predominantly on data written by white men, because language is shaped by gender. Men and women express suicidal distress differently, for example.

Crucially, awareness of these issues is growing, and initiatives to avert bias are emerging – often driven by women, such as Bioinfo4women-B4W, a program of the Barcelona Supercomputing Centre.

This example also reminds us that considerations around bias and gendered language in AI must extend beyond the English language, in order to be relevant to AI development around the globe.

Opportunities for inclusive design

But concerns don’t stop at the level of language. What if something so basic as our body build is not considered when AI is being developed?

As the use of AI expands into safety product design, we have an unprecedented opportunity to build better products by designing in features that adequately cater to human bodies – female and male.

Average female and male bodies have proportionality differences; we can’t simply scale from one to the other.

This point was driven home during the COVID pandemic, when wearing personal protective equipment (PPE) became mandatory.

Despite around 70 percent of global healthcare workers being women, PPE has been designed around the male body. A Canadian survey found that ill-fitting PPE not only failed to offer adequate protection; oversized, poorly fitting gear also posed a significant accident risk.

More studies are needed on this topic, but researchers have already proposed building AI-designed PPE. Ensuring that sex traits are considered in PPE design could be expected to improve safety.

Moves in the right direction

The accuracy of AI-assisted clinical diagnoses is completely reliant on the robustness of the underpinning data sets. Without actively accounting for sex and gender bias in historical data sets, AI may contribute to missed diagnoses or misdiagnoses.

Fortunately, adjusting for such biases appears to lead to better healthcare outcomes for women.

For example, the traditional risk assessment score for heart attacks, the Global Registry of Acute Coronary Events (GRACE), was updated in 2022 to incorporate AI predictive models that account for sex-specific disease characteristics.

This update has revolutionised the performance of this assessment tool. The success stems from the separate analysis of male and female data – which guides more female patients to lifesaving early intervention, helping overcome structural biases in patient management.
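The sex-stratified idea behind the GRACE update can be sketched with a toy calculation (the numbers below are entirely synthetic and this is not the actual GRACE model): when female patients' scores run systematically lower under a pooled model, a single decision threshold delivers different sensitivity for male and female patients, while per-sex thresholds restore parity:

```python
import math

def threshold_for_sensitivity(scores, labels, target=0.8):
    """Lowest score threshold that flags at least `target` of true cases."""
    positives = sorted((s for s, y in zip(scores, labels) if y == 1),
                       reverse=True)
    k = math.ceil(target * len(positives))
    return positives[k - 1]

def sensitivity(scores, labels, thr):
    """Fraction of true cases at or above the threshold."""
    flagged = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= thr)
    return flagged / sum(labels)

# Synthetic risk scores: female cases score systematically lower
# on the hypothetical pooled model.
male_scores,   male_labels   = [0.9, 0.8, 0.7, 0.6, 0.3, 0.2], [1, 1, 1, 1, 0, 0]
female_scores, female_labels = [0.6, 0.5, 0.4, 0.35, 0.2, 0.1], [1, 1, 1, 1, 0, 0]

pooled_thr = threshold_for_sensitivity(male_scores + female_scores,
                                       male_labels + female_labels)
female_thr = threshold_for_sensitivity(female_scores, female_labels)
# At the pooled threshold, male sensitivity is perfect while female
# sensitivity falls short; a per-sex threshold closes the gap.
```

Real sex-stratified models refit the underlying risk coefficients per group rather than just shifting a cut-off, but the toy example shows why pooling unequal distributions disadvantages one group.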

A practical example of an AI model designed to address and reduce gender bias is SMARThealth Pregnancy GPT. This tool, developed by The George Institute for Global Health, aims to improve access to guideline-based pregnancy advice for women living in rural and remote communities in India.

The concept was to develop a large language model chatbot that would be contextually sensitive and clinically accurate – and avoid entrenching harmful stereotypes.

The George Institute team worked closely with community health workers, clinicians and women living in rural communities, to co-create and refine the tool’s algorithm. Clinicians also scored AI-generated answers on accuracy, appropriateness for community health workers, completeness and risk of bias, which helped improve the chatbot’s responses.

The chatbot showcases AI’s potential in building healthcare worker capacity and enhancing health education in resource-limited settings – while avoiding bias and promoting women’s rights.

Gender-sensitive AI development could similarly improve countless other medical technologies that rely on data diversity and integrity for accuracy: for example, tailoring personalised treatments; predicting treatment responses; performing certain robot-assisted surgeries; monitoring patients remotely; delivering virtual health care; and accelerating drug discovery.

Initiatives to advance improved sex and gender equity in healthcare have begun to emerge in recent years, too. They include the newly launched Australian Centre for Sex and Gender Equity in Health and Medicine and the UK Medical Science Sex and Gender Equity.

These programs are actively advocating for routine consideration of sex and gender from discovery to translational research, including AI applications, to ensure scientific rigour as a robust foundation for advancing health and medical care.

AI is the future of healthcare, and we can’t afford to replicate past health inequities perpetuated by ignoring sex and gender. It is time to program AI to chart our course toward an ethical destiny.

(Disclaimer: The information in this article is provided for educational purposes and is not to be taken as medical advice.)

(Authors: Dr Sue Haupt is a Senior Research Fellow at the Centre for Sex and Gender Equity in Health and Medicine at the George Institute for Global Health at UNSW, an Honorary Senior Research Fellow at Deakin University and also at the Sir Peter MacCallum Department of Oncology, University of Melbourne. Prof Bronwyn Graham is the Director of the Centre for Sex and Gender Equity in Health and Medicine at the George Institute for Global Health and a Professor in the School of Psychology, UNSW. Prof Jane Hirst is the Program Director of Women’s Health at the George Institute for Global Health, School of Public Health, Imperial College London.)

(Originally published under Creative Commons by 360info)

(Except for the headline, this story has not been edited by NDTV staff and is published from a syndicated feed.)
