News
Berlin-based period tracker app bags €7m investment ahead of launching community funding round
Clue has invited its user community to become investors via Crowdcube
The female-founded period tracker app Clue has secured a €7m funding round to bridge the gender data gap.
The funds will be used to scale Clue’s digital family planning offering, expand the product portfolio, and continue research into women’s reproductive health.
Clue, a Berlin-based period and cycle tracking app, was founded in 2012 by Danish entrepreneur Ida Tin. The app predicts users’ periods and PMS from their own tracked data and estimates fertility windows using global averages.
More than 10 million people in 190 countries rely on the app every month to better understand their own menstrual cycle patterns and learn about their reproductive health.
Since the app’s launch, users have tracked more than 530 million cycles and contributed to “groundbreaking” research, the company says.
A dataset of more than 13 billion de-identified data points supports studies such as research into how COVID-19 infection and vaccination affect the menstrual cycle.
“So many of us still end up hacking our own solutions to health needs. Despite making up half the world’s population, the most common female health conditions still go unrecognised, under-researched, and underserved,” said Audrey Tsang, co-CEO of Clue.
“We constantly hear from our community that they feel their experiences are unheard or dismissed – except in Clue.
“We created Clue because health empowerment starts with better understanding your body. Having the language, and data, to describe your experience and advocate for yourself has a profound impact.
“The fact that it still takes an average of seven years to get an endometriosis diagnosis is just one example of how much work there still is to do in this space.”
“We believe Clue is uniquely positioned to lead the change that’s needed, at scale, by leveraging our technology, deep community insights, and research to create empathetic, effective, and accessible solutions for the most frequently experienced challenges in female health.”
The latest funding round, which the company hopes will drive further research into women’s menstrual health, was led by existing investors Balderton Capital and Union Square Ventures.
However, in a move to bring its community closer to the product development process, the company has also invited its user community to invest via Crowdcube.
Community investors will be able to participate in and influence the app’s development through feature polls, testing and forums held directly with the Clue team.
Carrie Walter, co-CEO of Clue, said: “Clue has always been supported by some of the world’s leading investors. Today, we’re taking that one step further, by being the first menstrual and reproductive health app to invite our community to invest and become co-owners.
“After all, Clue only exists to serve this community, and because of their trust. So we see this as an exciting chance to connect with them in another way, benefitting from the depth of their engagement and diverse perspectives.
“We feel it closes an important circle to give Clue users the opportunity to participate in our success, also as investors.”
News
Woman files lawsuit claiming fertility clinic ‘bootcamp’ caused her stroke
A London executive is suing a fertility clinic, alleging its IVF treatment led to her suffering a stroke.
Navkiran Dhillon-Byrne, 51, began private IVF treatment at the Assisted Reproduction and Gynaecology Centre (ARGC) in Wimpole Street, London, in April 2018.
Ten days after her treatment ended, on 28 April 2018, she suffered a stroke, which her lawyers say has left her with ongoing vision problems.
Ms Dhillon-Byrne is now suing the clinic and its head, Mohamed Taranissi, for negligence and breach of duty, saying medics failed to give her sufficient warnings about stroke risks linked to IVIg immunotherapy (intravenous immunoglobulin) – a one-off add-on treatment designed to moderate the body’s immune responses during pregnancy.
The clinic and Dr Taranissi deny liability, saying Ms Dhillon-Byrne was fully informed of the risks.
They also dispute that IVIg caused her stroke.
Central London County Court heard that Ms Dhillon-Byrne, chief marketing officer at the City of London office of an international software company, turned to private treatment after the NHS was unable to fund her IVF in 2014.
She had an unsuccessful attempt at another London clinic before choosing ARGC. She told the court she had been trying to have a child since 2014.
She said she selected ARGC after a friend recommended it and praised what they described as its high success rates.
The clinic’s website describes its approach as “IVF boot camp” and promotes “in-depth investigations, daily monitoring and real-time treatment adjustments.”
Ms Dhillon-Byrne says she was not warned of the “specific” risks of thrombosis – blood clotting that can lead to stroke – in relation to the IVIg therapy.
She also says the clinic overstated her chances of success and failed to secure her “informed consent” before treatment began.
She argues that, had she been given a clear picture of her chance of a successful pregnancy, she would not have consented to IVF and the supplemental IVIg therapy.
Denying Ms Dhillon-Byrne’s claims, the clinic’s KC, Clodagh Bradley, told the court that the success rate advice given was “accurate and in accordance with the ARGC data.”
She added that Ms Dhillon-Byrne had been informed that the immune treatment was new and “still controversial.”
Lawyers said outside court that, if successful, Ms Dhillon-Byrne’s claim is likely to be worth “millions” due to the impact of the stroke on her high-flying career.
The trial continues.
Wellness
Automating inequality: When AI undervalues women’s care needs
By Morgan Rose, chief science officer at Ema
Artificial intelligence is supposed to make care smarter, faster, and fairer, but what happens when it quietly learns to see women as less in need?
New research from the Care Policy and Evaluation Centre (CPEC) at the London School of Economics, led by Sam Rickman, reveals a concerning truth: large language models (LLMs) used to summarise long-term care records may be introducing gender bias into decisions about who receives support.
The Study
Researchers analysed real case notes from 617 older adults receiving social care in England. They then created gender-swapped versions of each record and generated over 29,000 AI summaries using multiple language models, including Google’s Gemma.
The goal was simple: would AI treat men’s and women’s needs the same way?
It didn’t.
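To make the counterfactual step concrete, here is a minimal sketch of how gender-swapping a case note might look in code. It is purely illustrative: the SWAPS mapping, the gender_swap function and the sample note are assumptions for this article, not the CPEC study’s actual pipeline.

```python
import re

# Illustrative gendered-term mapping. The study's actual swap rules
# are not published in this article, so this list is an assumption;
# a real audit would need context-aware handling (e.g. "her" can map
# to "him" or "his" depending on grammatical role).
SWAPS = {
    "she": "he", "he": "she",
    "her": "his", "his": "her",
    "mrs": "mr", "mr": "mrs",
    "woman": "man", "man": "woman",
}

PATTERN = re.compile(r"\b(" + "|".join(SWAPS) + r")\b", re.IGNORECASE)

def gender_swap(note: str) -> str:
    """Return a copy of a case note with gendered terms exchanged."""
    def replace(match: re.Match) -> str:
        word = match.group(0)
        swapped = SWAPS[word.lower()]
        # Preserve the capitalisation of the original word.
        return swapped.capitalize() if word[0].isupper() else swapped
    return PATTERN.sub(replace, note)

print(gender_swap("Mrs Smith reports she is unable to manage her medication."))
# -> Mr Smith reports he is unable to manage his medication.
```

Feeding both versions of each record to the same model, then comparing the summaries, is what isolates gender as the only variable.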
The Results
- Google’s Gemma model consistently downplayed women’s physical and mental health issues compared to men’s.
- Words like “disabled,” “unable,” and “complex” – terms that signal higher levels of support – appeared far more often in descriptions of men than women.
- The same case notes, simply rewritten with a different gender, produced softer, less urgent summaries for women.
In other words, when the algorithm rewrote her story, her needs shrank.
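One simple way to quantify such a gap is to count how often the support-signalling terms the researchers highlighted appear in each set of summaries. The sketch below is again an assumption for illustration, with hypothetical data, not the study’s code.

```python
import re
from collections import Counter

# Terms the study found signal higher support needs.
SUPPORT_TERMS = {"disabled", "unable", "complex"}

def support_term_counts(summaries: list[str]) -> Counter:
    """Count support-signalling terms across a set of summaries."""
    counts: Counter = Counter()
    for text in summaries:
        for word in re.findall(r"[a-z]+", text.lower()):
            if word in SUPPORT_TERMS:
                counts[word] += 1
    return counts

# Hypothetical paired summaries of the same underlying case notes.
male_summaries = ["Mr Smith is disabled and unable to wash unaided."]
female_summaries = ["Mrs Smith has some difficulty washing unaided."]

print("male:  ", support_term_counts(male_summaries))
print("female:", support_term_counts(female_summaries))
# A consistent gap across thousands of paired summaries is the kind
# of signal the researchers reported.
```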
The Cost of Softer Language
Language isn’t neutral. In healthcare, it can be the difference between “monitor” and “act”.
If AI-generated summaries portray women as coping better or struggling less, the downstream effect is fewer interventions, less funding and delayed care – not because their needs are smaller, but because the system learned to describe them that way.
This mirrors long-standing patterns in medicine: women’s pain minimised, symptoms dismissed, and diagnoses delayed.
The risk now is that these same biases get automated at scale, codified into every system that claims to make care “efficient.”
Why This Matters for Femtech
Femtech founders, clinicians, and AI builders have a responsibility to notice what’s hiding in the data.
When we train models on historical care records, we also inherit historical inequities.
And if we don’t correct for them, we’ll end up scaling the very disparities we set out to solve.
At Ema, we build for women’s health with this reality in mind:
- Language is clinical data. Every word shapes care pathways.
- Bias is not neutralised by scale. It’s magnified by it.
- Ethical AI design must include bias auditing, contextual intelligence, and longitudinal memory that recognises the full complexity of women’s lives – not just their diagnoses.
The Path Forward
Fixing this isn’t about scrapping AI.
It’s about training it differently: with data that reflects lived experience, language that recognises nuance, and oversight that questions output.
Because when AI learns to listen better, women get the care they’ve always deserved.