News
FDA approves first over-the-counter birth control pill
Opill may help reduce the number of unintended pregnancies and lower barriers to accessing contraceptives
The US Food and Drug Administration has approved the country’s first-ever daily hormonal contraceptive pill for sale without a prescription in a move that could dramatically change access to birth control.
Opill, the progestin-only pill from drugmaker Perrigo, will provide an option for obtaining oral contraceptives without needing to first see a healthcare provider, in hopes of reducing barriers to access.
Almost half of the over six million pregnancies in the US each year are unintended, according to the FDA’s news release.
Unintended pregnancies have been linked to negative maternal and perinatal outcomes, including reduced likelihood of receiving early prenatal care and increased risk of preterm delivery, with associated adverse neonatal, developmental and child health outcomes.
Availability of nonprescription Opill, the FDA said, may help reduce the number of unintended pregnancies and their potential negative impacts.
“Today’s approval marks the first time a nonprescription daily oral contraceptive will be an available option for millions of people in the United States,” said Dr Patrizia Cavazzoni, director of the FDA’s Center for Drug Evaluation and Research.
“When used as directed, daily oral contraception is safe and is expected to be more effective than currently available nonprescription contraceptive methods in preventing unintended pregnancy.”
The contraceptive efficacy of Opill (norgestrel) was established with the original approval for prescription use in 1973. HRA Pharma applied to switch norgestrel from a prescription to an over-the-counter product.
To approve a product for nonprescription use, the FDA requires the applicant to demonstrate that consumers can use it safely and effectively, relying only on the nonprescription drug labeling and without any assistance from a healthcare professional.
Studies showed that consumer understanding of the Opill Drug Facts label was high overall, and that a high proportion of consumers understood the label instructions well enough to use the drug properly once it is available as an over-the-counter product.
The FDA concluded Opill is “safe and effective” when used properly. The guidelines include taking the pill at the same time every day; not using it along with another hormonal birth control product, including intra-uterine devices (IUDs); and avoiding medications that interact with it.
However, the agency warned Opill should not be used by those who have or have ever had breast cancer.
“Consumers who have any other form of cancer should ask a doctor before use,” it added.
The most common side effects of Opill include irregular bleeding, headaches, dizziness, nausea, increased appetite, abdominal pain, cramps or bloating.
Use of Opill may also be associated with changes in vaginal bleeding patterns, such as irregular spotting and prolonged bleeding.
The pill is expected to go on sale at major retailers early next year, with no age restrictions.
News
Woman files lawsuit claiming fertility clinic ‘bootcamp’ caused her stroke
A London executive is suing a fertility clinic, alleging its IVF treatment led to her suffering a stroke.
Navkiran Dhillon-Byrne, 51, began private IVF treatment at the Assisted Reproduction and Gynaecology Centre (ARGC) in Wimpole Street, London, in April 2018.
Ten days after her treatment ended, on 28 April 2018, she suffered a stroke, which her lawyers say has left her with ongoing vision problems.
Ms Dhillon-Byrne is now suing the clinic and its head, Mohamed Taranissi, for negligence and breach of duty, saying medics failed to give her sufficient warnings about stroke risks linked to IVIg immunotherapy (intravenous immunoglobulin) – a one-off add-on treatment designed to moderate the body’s immune responses during pregnancy.
The clinic and Dr Taranissi deny liability, saying Ms Dhillon-Byrne was fully informed of the risks.
They also dispute that IVIg caused her stroke.
Central London County Court heard that Ms Dhillon-Byrne, chief marketing officer at the City of London base of an international software company, turned to private treatment after the NHS was unable to fund her IVF in 2014.
She had an unsuccessful attempt at another London clinic before choosing ARGC. She told the court she had been trying to have a child since 2014.
She said she selected ARGC after a friend recommended it and praised what they described as its high success rates.
The clinic’s website describes its approach as “IVF boot camp” and promotes “in-depth investigations, daily monitoring and real-time treatment adjustments.”
Ms Dhillon-Byrne says she was not warned of the “specific” risks of thrombosis – blood clotting that can lead to stroke – in relation to the IVIg therapy.
She also says the clinic overstated her chances of success and failed to secure her “informed consent” before treatment began.
She argues that, had she been given a clear picture of her chance of a successful pregnancy, she would not have consented to IVF and the supplemental IVIg therapy.
Denying Ms Dhillon-Byrne’s claims, the clinic’s KC, Clodagh Bradley, told the court that the success rate advice given was “accurate and in accordance with the ARGC data.”
She added that Ms Dhillon-Byrne had been informed that the immune treatment was new and “still controversial.”
Lawyers said outside court that, if successful, Ms Dhillon-Byrne’s claim is likely to be worth “millions” due to the impact of the stroke on her high-flying career.
The trial continues.
Diagnosis
Automating inequality: When AI undervalues women’s care needs
By Morgan Rose, chief science officer at Ema
Artificial intelligence is supposed to make care smarter, faster, and fairer, but what happens when it quietly learns to see women as less in need?
New research from the Care Policy and Evaluation Centre (CPEC) at the London School of Economics, led by Sam Rickman, reveals a concerning truth: large language models (LLMs) used to summarise long-term care records may be introducing gender bias into decisions about who receives support.
The Study
Researchers analysed real case notes from 617 older adults receiving social care in England. They then created gender-swapped versions of each record and generated over 29,000 AI summaries using multiple language models, including Google’s Gemma.
The goal was simple: would AI treat men’s and women’s needs the same way?
It didn’t.
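For teams who want to probe their own pipelines for the same effect, the sketch below shows the basic shape of a gender-swap audit: flip the gendered terms in a case note, summarise both versions, and compare how often severity language appears. This is an illustrative outline, not the CPEC team’s code; the summarise callable, the swap map and the three severity terms are stand-ins you would replace with your own model and vocabulary.

```python
# Illustrative sketch of a gender-swap audit, NOT the CPEC study's code.
# `summarise` is a placeholder for whatever LLM call your pipeline uses.
import re
from collections import Counter
from typing import Callable

# Terms the LSE study highlights as signalling higher support needs.
SEVERITY_TERMS = {"disabled", "unable", "complex"}

# Minimal pronoun map; a real audit also needs names, titles and
# possessive/objective case handling ("her" is ambiguous, for example).
SWAP = {"he": "she", "she": "he", "his": "her", "her": "his",
        "him": "her", "mr": "ms", "mrs": "mr", "ms": "mr"}


def gender_swap(text: str) -> str:
    """Return the case note with gendered terms flipped."""
    pattern = re.compile(r"\b(" + "|".join(SWAP) + r")\b", re.IGNORECASE)

    def repl(match: re.Match) -> str:
        word = match.group(0)
        swapped = SWAP[word.lower()]
        return swapped.capitalize() if word[0].isupper() else swapped

    return pattern.sub(repl, text)


def severity_counts(summary: str) -> Counter:
    """Count severity-signalling terms in a single summary."""
    tokens = re.findall(r"[a-z]+", summary.lower())
    return Counter(t for t in tokens if t in SEVERITY_TERMS)


def audit(case_notes: list[str], summarise: Callable[[str], str]) -> dict:
    """Summarise each note and its gender-swapped twin, then compare
    how often severity language appears in each condition."""
    original, swapped = Counter(), Counter()
    for note in case_notes:
        original += severity_counts(summarise(note))
        swapped += severity_counts(summarise(gender_swap(note)))
    return {"original": original, "gender_swapped": swapped}
```

A large gap between the two counts on identical underlying notes is the kind of signal the LSE researchers report for Gemma.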
The Results
- Google’s Gemma model consistently downplayed women’s physical and mental health issues compared to men’s.
- Words that signal higher levels of support, such as “disabled,” “unable” and “complex,” appeared far more often in descriptions of men than of women.
- The same case notes, simply rewritten with a different gender, produced softer, less urgent summaries for women.
In other words, when the algorithm rewrote her story, her needs shrank.
The Cost of Softer Language
Language isn’t neutral. In healthcare, it’s the difference between monitor and act.
If AI-generated summaries portray women as coping better or struggling less, the downstream effect is fewer interventions, less funding, and delayed care: not because their needs are smaller, but because the system has learned to describe them that way.
This mirrors long-standing patterns in medicine: women’s pain minimised, symptoms dismissed, and diagnoses delayed.
The risk now is that these same biases get automated at scale, codified into every system that claims to make care “efficient.”
Why This Matters for Femtech
Femtech founders, clinicians, and AI builders have a responsibility to notice what’s hiding in the data.
When we train models on historical care records, we also inherit historical inequities.
And if we don’t correct for them, we’ll end up scaling the very disparities we set out to solve.
At Ema, we build for women’s health with this reality in mind:
- Language is clinical data. Every word shapes care pathways.
- Bias is not neutralised by scale. It’s magnified by it.
- Ethical AI design must include bias auditing, contextual intelligence, and longitudinal memory that recognises the full complexity of women’s lives, not just their diagnoses.
The Path Forward
Fixing this isn’t about scrapping AI.
It’s about training it differently, with data that reflects lived experience, language that recognises nuance, and oversight that questions its outputs.
Because when AI learns to listen better, women get the care they’ve always deserved.