News
Stockholm-based birth control app raises US$7m ahead of partnership with Samsung
The start-up Natural Cycles will partner with Samsung in a groundbreaking collaboration
The Swedish start-up Natural Cycles has secured US$7m in funding to advance its contraceptive app, ahead of a partnership with Samsung.
The women’s health company based in Stockholm has developed the world’s first certified app that uses body temperature and other key fertility indicators to determine each user’s unique fertility status.
Dr Elina Berglund and her husband, Dr Raoul Scherwitzl, were in search of an effective method of natural birth control when they founded Natural Cycles in 2013.
Five years later, the app became the first digital birth control to be cleared by the US FDA, and it is now certified as a contraceptive in Europe, Australia and Singapore.
Through the US$7m funding round, led by Samsung Ventures, Natural Cycles’ fertility technology will support the cycle tracking feature integrated into Samsung’s watches, marking the first time the app’s algorithm has been adapted for a smartwatch.
According to Samsung, the new skin temperature-based cycle tracking capabilities will be available through the Samsung Health app on the Galaxy Watch5 and Watch5 Pro by the end of June.
The tech giant has confirmed that all data collected will be encrypted and stored on the user’s device, giving users more control over their health data and “better peace of mind”.
Hon Pak, vice president and head of digital health team, MX Business at Samsung Electronics, said: “Consumers can now easily track their menstrual cycle right from their wrist, combining Natural Cycles’ innovative fertility technology with Samsung’s superior temperature sensor to provide a more holistic understanding of their health and wellbeing.
“This is another demonstration of Samsung’s open collaboration philosophy with other industry leaders to create better health experiences.”
Dr Scherwitzl, co-founder of Natural Cycles, said: “We are excited to partner with Samsung to deliver a premium experience that pushes women’s health forward.
“The Natural Cycles app has helped millions of women around the world take control of their fertility and this partnership will allow Samsung to leverage our fertility technology to offer temperature-based cycle tracking through a smartwatch for the first time.”
In an interview with Forbes, Berglund said: “I had concerns that women’s health innovation could suffer a setback but with the growth we have seen at the company – as well as the support we’re seeing from partners like Samsung – it’s evident that’s not the case.
“I can feel a shift as women demand access to more high-quality products and I’m confident more companies will follow the same path and we’ll see further investment within women’s health this year.”
News
Woman files lawsuit claiming fertility clinic ‘bootcamp’ caused her stroke
A London executive is suing a fertility clinic, alleging its IVF treatment led to her suffering a stroke.
Navkiran Dhillon-Byrne, 51, began private IVF treatment at the Assisted Reproduction and Gynaecology Centre (ARGC) in Wimpole Street, London, in April 2018.
Ten days after her treatment ended, on 28 April 2018, she suffered a stroke, which her lawyers say has left her with ongoing vision problems.
Ms Dhillon-Byrne is now suing the clinic and its head, Mohamed Taranissi, for negligence and breach of duty, saying medics failed to give her sufficient warnings about stroke risks linked to IVIg immunotherapy (intravenous immunoglobulin) – a one-off add-on treatment designed to moderate the body’s immune responses during pregnancy.
The clinic and Dr Taranissi deny liability, saying Ms Dhillon-Byrne was fully informed of the risks.
They also dispute that IVIg caused her stroke.
Central London County Court heard that Ms Dhillon-Byrne, chief marketing officer at the City of London office of an international software company, turned to private treatment after the NHS was unable to fund her IVF in 2014.
She had an unsuccessful attempt at another London clinic before choosing ARGC. She told the court she had been trying to have a child since 2014.
She said she selected ARGC after a friend recommended it, praising what they described as high success rates.
The clinic’s website describes its approach as “IVF boot camp” and promotes “in-depth investigations, daily monitoring and real-time treatment adjustments.”
Ms Dhillon-Byrne says she was not warned of the “specific” risks of thrombosis – blood clotting that can lead to stroke – in relation to the IVIg therapy.
She also says the clinic overstated her chances of success and failed to secure her “informed consent” before treatment began.
She argues that, had she been given a clear picture of her chance of a successful pregnancy, she would not have consented to IVF and the supplemental IVIg therapy.
Denying Ms Dhillon-Byrne’s claims, the clinic’s KC, Clodagh Bradley, told the court that the success rate advice given was “accurate and in accordance with the ARGC data.”
She added that Ms Dhillon-Byrne had been informed that the immune treatment was new and “still controversial.”
Lawyers said outside court that, if successful, Ms Dhillon-Byrne’s claim is likely to be worth “millions” due to the impact of the stroke on her high-flying career.
The trial continues.
News
Automating inequality: When AI undervalues women’s care needs
By Morgan Rose, chief science officer at Ema
Artificial intelligence is supposed to make care smarter, faster, and fairer, but what happens when it quietly learns to see women as less in need?
New research from the Care Policy and Evaluation Centre (CPEC) at the London School of Economics, led by Sam Rickman, reveals a concerning truth: large language models (LLMs) used to summarise long-term care records may be introducing gender bias into decisions about who receives support.
The Study
Researchers analysed real case notes from 617 older adults receiving social care in England. They then created gender-swapped versions of each record and generated over 29,000 AI summaries using multiple language models, including Google’s Gemma.
The goal was simple: would AI treat men’s and women’s needs the same way?
It didn’t.
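The mechanics of that comparison are simple to illustrate. The sketch below is a hypothetical, simplified version of the gender-swap step only, not the CPEC team’s code: `swap_gender`, `summarise` and the pronoun map are illustrative placeholders, with `summarise` standing in for whichever model is being audited.

```python
# Hypothetical illustration only: a simplified gender-swap step of the
# kind described in the study. `summarise` is a placeholder for whichever
# model is being audited; the pronoun map is deliberately naive.
import re

PRONOUN_SWAP = {
    "she": "he", "he": "she",
    "her": "his", "his": "her",
    "mrs": "mr", "mr": "mrs",
    "woman": "man", "man": "woman",
}

def swap_gender(case_note: str) -> str:
    """Return a copy of the case note with gendered terms swapped.

    A real audit would also need to handle her/him ambiguity and names;
    this version only swaps the words listed in PRONOUN_SWAP.
    """
    pattern = r"\b(" + "|".join(PRONOUN_SWAP) + r")\b"

    def repl(match: re.Match) -> str:
        word = match.group(0)
        swapped = PRONOUN_SWAP[word.lower()]
        return swapped.capitalize() if word[0].isupper() else swapped

    return re.sub(pattern, repl, case_note, flags=re.IGNORECASE)

def summarise(case_note: str) -> str:
    """Placeholder for a call to the language model under audit."""
    raise NotImplementedError

def paired_summaries(case_note: str) -> tuple[str, str]:
    """Summarise the original record and its gender-swapped twin."""
    return summarise(case_note), summarise(swap_gender(case_note))
```

Running something like `paired_summaries` over every record and comparing the two outputs is, in essence, what the researchers did at the scale of 29,000 summaries.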
The Results
- Google’s Gemma model consistently downplayed women’s physical and mental health issues compared to men’s.
- Words such as “disabled”, “unable” and “complex” (terms that signal higher levels of support) appeared far more often in descriptions of men than of women.
- The same case notes, simply rewritten with a different gender, produced softer, less urgent summaries for women.
In other words, when the algorithm rewrote her story, her needs shrank.
The Cost of Softer Language
Language isn’t neutral. In healthcare, it’s the difference between “monitor” and “act”.
If AI-generated summaries portray women as coping better or struggling less, the downstream effect is fewer interventions, less funding and delayed care: not because their needs are smaller, but because the system has learned to describe them that way.
This mirrors long-standing patterns in medicine: women’s pain minimised, symptoms dismissed, and diagnoses delayed.
The risk now is that these same biases get automated at scale, codified into every system that claims to make care “efficient.”
Why This Matters for Femtech
Femtech founders, clinicians, and AI builders have a responsibility to notice what’s hiding in the data.
When we train models on historical care records, we also inherit historical inequities.
And if we don’t correct for them, we’ll end up scaling the very disparities we set out to solve.
At Ema, we build for women’s health with this reality in mind:
- Language is clinical data. Every word shapes care pathways.
- Bias is not neutralised by scale. It’s magnified by it.
- Ethical AI design must include bias auditing, contextual intelligence, and longitudinal memory that recognises the full complexity of women’s lives, not just their diagnoses (a minimal sketch of one such audit follows below).
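As a rough illustration of what a bias audit might look like in practice, the sketch below counts severity-signalling words in paired summaries and flags asymmetries. The term list, pairing format and tolerance threshold are invented for the example; they are not Ema’s or the study’s actual methodology.

```python
# Hypothetical bias-audit sketch: count severity-signalling terms in paired
# summaries of the same record (female version, male version) and flag a
# model whose gap exceeds a chosen tolerance. Terms and threshold are
# illustrative assumptions, not a validated methodology.
import re

SEVERITY_TERMS = {"disabled", "unable", "complex", "urgent", "deteriorating"}

def severity_count(summary: str) -> int:
    """Number of severity-signalling words in one summary."""
    words = re.findall(r"[a-z]+", summary.lower())
    return sum(1 for w in words if w in SEVERITY_TERMS)

def audit(pairs: list[tuple[str, str]], tolerance: float = 0.1) -> dict:
    """pairs holds (summary_of_female_version, summary_of_male_version)."""
    female_total = sum(severity_count(f) for f, _ in pairs)
    male_total = sum(severity_count(m) for _, m in pairs)
    gap = (male_total - female_total) / max(male_total, 1)
    return {
        "female_terms": female_total,
        "male_terms": male_total,
        "relative_gap": gap,
        "flagged": gap > tolerance,  # asymmetry beyond the chosen tolerance
    }
```

A check this simple will not catch subtler shifts in tone, but it makes the kind of asymmetry the study found measurable and repeatable.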
The Path Forward
Fixing this isn’t about scrapping AI.
It’s about training it differently with data that reflects lived experience, language that recognises nuance, and oversight that questions output.
Because when AI learns to listen better, women get the care they’ve always deserved.