News
UK: Babies born to black mothers 81% more likely to die in NHS care
Babies born to black mothers in England and Wales are 81 per cent more likely to die in neonatal units than those born to white mothers, new data has revealed.
An analysis of more than 700,000 babies admitted to NHS neonatal units between 2012 and 2022 also found that those from the most deprived areas faced a 63 per cent higher risk of death.
The study, covering hospitals across England and Wales, revealed what researchers called “deeply concerning” levels of inequality.
Samira Saberian, a PhD student at the University of Liverpool and the study’s lead author, said the findings showed that “socioeconomic and ethnic inequalities independently shape survival in neonatal units, and maternal and birth factors explain only just over half of the socioeconomic and ethnic inequalities”.
“To reduce these inequalities, we need integrated approaches that strengthen clinical care while also tackling the wider conditions affecting families,” the researcher added.
“By improving services and addressing the root drivers of inequality, we can give the most vulnerable babies a better chance of survival.”
Neonatal units provide specialist care for premature babies or those born with serious health conditions. The mortality rates reflect deaths before discharge from these units.
Black babies had the highest mortality rates for most of the study years, peaking at 29.7 deaths per 1,000 babies.
The highest rate among white babies was 16.9 per 1,000.
For babies born to mothers in the most deprived areas, the highest mortality rate reached 25.9 per 1,000 in 2022, compared with 12.8 per 1,000 among those from the least deprived areas.
The study is the first to examine both socioeconomic and ethnic inequalities in neonatal units.
Researchers said the results highlight factors beyond medical treatment that influence survival rates.
Even after accounting for maternal and birth factors, the 81 per cent higher risk for black babies remained, pointing to structural inequalities that require solutions beyond clinical care.
News
FDA removes warning label from menopause drugs
News
Woman files lawsuit claiming fertility clinic ‘bootcamp’ caused her stroke
A London executive is suing a fertility clinic, alleging its IVF treatment led to her suffering a stroke.
Navkiran Dhillon-Byrne, 51, began private IVF treatment at the Assisted Reproduction and Gynaecology Centre (ARGC) in Wimpole Street, London, in April 2018.
Ten days after her treatment ended, on 28 April 2018, she suffered a stroke, which her lawyers say has left her with ongoing vision problems.
Ms Dhillon-Byrne is now suing the clinic and its head, Mohamed Taranissi, for negligence and breach of duty, saying medics failed to give her sufficient warnings about stroke risks linked to IVIg immunotherapy (intravenous immunoglobulin) – a one-off add-on treatment designed to moderate the body’s immune responses during pregnancy.
The clinic and Dr Taranissi deny liability, saying Ms Dhillon-Byrne was fully informed of the risks.
They also dispute that IVIg caused her stroke.
Central London County Court heard that Ms Dhillon-Byrne, chief marketing officer at the City of London base of an international software company, turned to private treatment after the NHS was unable to fund her IVF in 2014.
She had an unsuccessful attempt at another London clinic before choosing ARGC. She told the court she had been trying to have a child since 2014.
She said she selected ARGC on the recommendation of a friend, who praised what they described as its high success rates.
The clinic’s website describes its approach as “IVF boot camp” and promotes “in-depth investigations, daily monitoring and real-time treatment adjustments.”
Ms Dhillon-Byrne says she was not warned of the “specific” risks of thrombosis – blood clotting that can lead to stroke – in relation to the IVIg therapy.
She also says the clinic overstated her chances of success and failed to secure her “informed consent” before treatment began.
She argues that, had she been given a clear picture of her chance of a successful pregnancy, she would not have consented to IVF and the supplemental IVIg therapy.
Denying Ms Dhillon-Byrne’s claims, the clinic’s KC, Clodagh Bradley, told the court that the success rate advice given was “accurate and in accordance with the ARGC data.”
She added that Ms Dhillon-Byrne had been informed that the immune treatment was new and “still controversial.”
Lawyers said outside court that, if successful, Ms Dhillon-Byrne’s claim is likely to be worth “millions” due to the impact of the stroke on her high-flying career.
The trial continues.
Opinion
Automating inequality: When AI undervalues women’s care needs
By Morgan Rose, chief science officer at Ema
Artificial intelligence is supposed to make care smarter, faster, and fairer, but what happens when it quietly learns to see women as less in need?
New research from the Care Policy and Evaluation Centre (CPEC) at the London School of Economics, led by Sam Rickman, reveals a concerning truth: large language models (LLMs) used to summarise long-term care records may be introducing gender bias into decisions about who receives support.
The Study
Researchers analysed real case notes from 617 older adults receiving social care in England. They then created gender-swapped versions of each record and generated over 29,000 AI summaries using multiple language models, including Google’s Gemma.
The goal was simple: would AI treat men’s and women’s needs the same way?
It didn’t.
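To make the method concrete, here is a minimal sketch of a gender-swap audit, assuming a naive pronoun map and a stand-in summarise() callable; the study’s actual pipeline is more careful than this.

```python
# Illustrative sketch only: the SWAPS map and the summarise() callable are
# assumptions for demonstration, not the CPEC study's actual code.
import re

SWAPS = {
    "she": "he", "he": "she",
    "her": "his", "his": "her",
    "hers": "his", "him": "her",
    "woman": "man", "man": "woman",
    "mrs": "mr", "mr": "mrs",
}

_PATTERN = re.compile(r"\b(" + "|".join(SWAPS) + r")\b", re.IGNORECASE)


def gender_swap(text: str) -> str:
    """Naively swap gendered tokens while preserving capitalisation."""
    def repl(match):
        word = match.group(0)
        swapped = SWAPS[word.lower()]
        return swapped.capitalize() if word[0].isupper() else swapped
    return _PATTERN.sub(repl, text)


def paired_summaries(case_notes, summarise):
    """Return (original_summary, swapped_summary) pairs for each record.

    `summarise` stands in for whichever LLM call produces the care summary.
    """
    return [(summarise(note), summarise(gender_swap(note))) for note in case_notes]
```

Comparing the two summaries in each pair, record by record, is what allows any difference to be attributed to gender alone, since everything else about the case note is held constant.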
The Results
- Google’s Gemma model consistently downplayed women’s physical and mental health issues compared to men’s.
- Words that signal higher levels of support, such as "disabled," "unable," and "complex," appeared far more often in descriptions of men than of women (a simple version of this comparison is sketched after this list).
- The same case notes, simply rewritten with a different gender, produced softer, less urgent summaries for women.
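That comparison can be sketched in a few lines: count how often the higher-need markers appear in the summaries generated for the male and female versions of the same records. The marker list below simply reuses the terms quoted above, and the data shapes are assumptions.

```python
# Rough sketch of the marker-word comparison; the marker set reuses the terms
# quoted in the article, and the input lists are assumed to come from the
# paired_summaries() sketch above.
import re
from collections import Counter

MARKERS = {"disabled", "unable", "complex"}


def marker_counts(summaries):
    """Count how often each higher-need marker appears across a set of summaries."""
    counts = Counter()
    for summary in summaries:
        for token in re.findall(r"[a-z]+", summary.lower()):
            if token in MARKERS:
                counts[token] += 1
    return counts


def compare(male_summaries, female_summaries):
    """Return {marker: (count in male-version summaries, count in female-version summaries)}."""
    male = marker_counts(male_summaries)
    female = marker_counts(female_summaries)
    return {marker: (male[marker], female[marker]) for marker in MARKERS}
```

A large gap between the two counts for the same underlying records, as the study reported, points to the model rather than the patients.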
In other words, when the algorithm rewrote her story, her needs shrank.
The Cost of Softer Language
Language isn’t neutral. In healthcare, it’s the difference between monitor and act.
If AI-generated summaries portray women as coping better or struggling less, the downstream effect is fewer interventions, less funding, and delayed care: not because their needs are smaller, but because the system has learned to describe them that way.
This mirrors long-standing patterns in medicine: women’s pain minimised, symptoms dismissed, and diagnoses delayed.
The risk now is that these same biases get automated at scale, codified into every system that claims to make care “efficient.”
Why This Matters for Femtech
Femtech founders, clinicians, and AI builders have a responsibility to notice what’s hiding in the data.
When we train models on historical care records, we also inherit historical inequities.
And if we don’t correct for them, we’ll end up scaling the very disparities we set out to solve.
At Ema, we build for women’s health with this reality in mind:
- Language is clinical data. Every word shapes care pathways.
- Bias is not neutralised by scale. It’s magnified by it.
- Ethical AI design must include bias auditing, contextual intelligence, and longitudinal memory that recognises the full complexity of women’s lives, not just their diagnoses.
The Path Forward
Fixing this isn’t about scrapping AI.
It’s about training it differently: with data that reflects lived experience, language that recognises nuance, and oversight that questions its outputs.
Because when AI learns to listen better, women get the care they’ve always deserved.