Insight
Automating inequality: When AI undervalues women’s care needs

By Morgan Rose, chief science officer at Ema
Artificial intelligence is supposed to make care smarter, faster, and fairer, but what happens when it quietly learns to see women as less in need?
New research from the Care Policy and Evaluation Centre (CPEC) at the London School of Economics, led by Sam Rickman, reveals a concerning truth: large language models (LLMs) used to summarise long-term care records may be introducing gender bias into decisions about who receives support.
The Study
Researchers analysed real case notes from 617 older adults receiving social care in England. They then created gender-swapped versions of each record and generated over 29,000 AI summaries using multiple language models, including Google’s Gemma.
The goal was simple: would AI treat men’s and women’s needs the same way?
It didn’t.
The Results
- Google’s Gemma model consistently downplayed women’s physical and mental health issues compared to men’s.
- Words like “disabled,” “unable” and “complex” (terms that signal higher levels of support) appeared far more often in descriptions of men than of women.
- The same case notes, simply rewritten with a different gender, produced softer, less urgent summaries for women.
In other words, when the algorithm rewrote her story, her needs shrank.
The Cost of Softer Language
Language isn’t neutral. In healthcare, it’s the difference between monitor and act.
If AI-generated summaries portray women as coping better or struggling less, the downstream effect is fewer interventions, less funding and delayed care: not because their needs are smaller, but because the system learned to describe them that way.
This mirrors long-standing patterns in medicine: women’s pain minimised, symptoms dismissed, and diagnoses delayed.
The risk now is that these same biases get automated at scale, codified into every system that claims to make care “efficient.”
Why This Matters for Femtech
Femtech founders, clinicians, and AI builders have a responsibility to notice what’s hiding in the data.
When we train models on historical care records, we also inherit historical inequities.
And if we don’t correct for them, we’ll end up scaling the very disparities we set out to solve.
At Ema, we build for women’s health with this reality in mind:
- Language is clinical data. Every word shapes care pathways.
- Bias is not neutralised by scale. It’s magnified by it.
- Ethical AI design must include bias auditing, contextual intelligence, and longitudinal memory that recognises the full complexity of women’s lives, not just their diagnoses.
The Path Forward
Fixing this isn’t about scrapping AI.
It’s about training it differently: with data that reflects lived experience, language that recognises nuance, and oversight that questions output.
Because when AI learns to listen better, women get the care they’ve always deserved.
Insight
Report makes the case for an incentive change in health data

In a new report, “The Case for Incentive Change in Healthcare Data,” WHIS Lead Producer Poppy Howard-Wall explores why healthcare’s biggest data challenge may not be technical but economic.
Integrating learnings from Poppy’s conversations with senior leaders at the ViVE Summit, the report highlights how fragmented data and misaligned incentives continue to limit the industry’s ability to deliver truly longitudinal care.
Howard-Wall writes: “For the women’s health industry, where many conditions have historically been under-researched and longitudinal datasets remain incomplete, the consequences of fragmented data infrastructure are even more pronounced.
“Artificial intelligence promises to accelerate discovery, improve diagnosis and enable more proactive care. But its potential is inseparable from the data ecosystems that support it.
“In the absence of strong economic incentives for deeper integration, the question becomes how the industry is beginning to navigate this constraint and what signals are emerging about the future of healthcare data and AI in women’s health.”