How femtech can navigate the EU medical device and AI rules

By Xisca Borrás, Partner – Life sciences regulatory and Ellie Handy, Senior Associate – Life sciences regulatory, Bristows

Femtech, short for female technology, is an important and fast-growing sector. The EU is a key market for femtech, with five of the top 10 countries for femtech investment located in the EU.

Femtech products are developed for many areas of women’s health, such as menstrual health, pregnancy planning and monitoring, menopause and mental wellbeing.

As femtech is intrinsically linked to health needs, a key question for femtech products is whether they are regulated as medical devices or merely consumer products.

Additionally, many femtech products are embracing the use of artificial intelligence (“AI”).

Therefore, another key question is whether products using AI will be regulated as “high-risk” AI systems under the EU’s new AI legal framework.

This article looks at when femtech apps and software qualify as medical devices in the EU and how the medical device and AI legal frameworks interact.

What is a software medical device?

The definition of “medical device” in the EU’s Medical Device Regulation 2017/745 (the “EU MDR”) includes software, used alone or in combination, that is intended by its legal manufacturer for a medical purpose.

These medical purposes are listed in the EU MDR and include (amongst others):

  • diagnosis, prevention, monitoring, prediction, prognosis, treatment or alleviation of disease;
  • diagnosis, monitoring, treatment, alleviation of, or compensation for, an injury or disability; and
  • control or support of conception.

The legal manufacturer is the person that puts its name or branding on the device and takes responsibility for it.

Whether software is considered a medical device will depend on whether the manufacturer states it has a medical purpose in the relevant documentation/materials.

The EU MDR defines intended purpose as “the use for which a device is intended according to the data supplied by the manufacturer on the label, in the instructions for use or in promotional or sales materials or statements and as specified by the manufacturer in the clinical evaluation”.

What is the test for qualifying as a medical device in the EU?

There is a selection of guidance documents that can assist you in determining whether a product should qualify as a medical device.

We summarise some of the key guidance below:

  1. MDCG 2019-11 rev.1 

Under the EU MDR, the Medical Device Coordination Group (“MDCG”) has published guidance on the qualification and classification of software as a medical device.

It sets out five decision steps to help determine if a piece of software is a medical device in the EU. The steps are:

  • Step 1: Is the product software?
  • Step 2: Is it standalone software (i.e., it is not an accessory to, nor driving or influencing the use of, a hardware medical device), and is it outside the scope of Annex XVI (which covers certain products without an intended medical purpose)?
  • Step 3: Is it performing an action on data beyond storage, archival, communication, simple search or lossless compression?
  • Step 4: Does it act for the benefit of an individual patient?
  • Step 5: Does it have a medical purpose (as set out in the medical device definition)?

If the answer to all five questions is yes, it will qualify as a medical device.

In this case, manufacturers will have to ensure they comply with the pre-market requirements set out in the EU MDR before they can place the software medical device on the market.

Notably, they will need to set up a quality management system, compile a technical file, undergo the appropriate conformity assessment and affix a CE mark.

Importantly, the manufacturers would also need to consider post-market requirements, such as having a post-market surveillance system and undertaking post-market vigilance.
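
To make the cumulative nature of the five-step test concrete, the sketch below (our own illustrative Python, not part of the MDCG guidance, and no substitute for a proper legal assessment) simply requires a “yes” at every step:

  # Minimal sketch of the MDCG 2019-11 rev.1 decision steps described above.
  # A "no" at any step means the product does not qualify as a software medical
  # device via this decision path (other EU MDR rules may still be relevant).
  def qualifies_as_software_medical_device(is_software,
                                           is_standalone_and_outside_annex_xvi,
                                           processes_data_beyond_storage_search_compression,
                                           acts_for_benefit_of_individual_patient,
                                           has_medical_purpose):
      return all([is_software,
                  is_standalone_and_outside_annex_xvi,
                  processes_data_beyond_storage_search_compression,
                  acts_for_benefit_of_individual_patient,
                  has_medical_purpose])

  # Example: a cycle-tracking app promoted purely for lifestyle logging answers
  # "no" to step 5, so it does not qualify as a medical device.
  print(qualifies_as_software_medical_device(True, True, True, True, False))  # False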

  2. Other relevant guidance

The MDCG has also published a Manual on borderline and classification of medical devices under the EU MDR.

Additional sources of guidance may also be available from national competent authorities.

The legal manufacturer could also look at examples of other products already on the market to see how they are regulated (e.g. looking at EUDAMED).

However, we would caution against relying too heavily on the regulation of other products, as there is no guarantee that they are compliant.

What if you’re not a medical device?

If the software does not qualify as a medical device, the product will not have to comply with the EU MDR.

However, the manufacturer should be careful about how it promotes its product and the claims it makes about it because, as discussed above, a medical device is defined based on the manufacturer’s intended purpose.

Let’s take the example of a simple period-tracking app.

Using it for logging period dates, tracking ovulation, and predicting future cycles has no medical purpose and is therefore not a medical device.

However, if its manufacturer recommends the software for contraception and/or to support conception, it will have a medical purpose and would therefore qualify as a medical device.

As such, the manufacturer would either have to bring the device into conformity with the EU MDR or take action to change the promotional materials to remove the medical claims.

Interaction between medical devices and AI legal frameworks 

Under the EU MDR, devices are assigned risk classifications.

For the lowest risk devices (Class I medical devices), the manufacturer can self-certify compliance with the EU MDR prior to the product being placed on the market or put into service in the EU.

However, higher risk devices (Class IIa and above) must undergo a third-party conformity assessment carried out by a notified body.

Notified body conformity assessments require a detailed review of the manufacturer’s quality management system, technical documentation, systems and procedures.

The process will often take more than a year to complete.

Additionally, manufacturers have to grapple with ongoing burdens such as vigilance and post-market surveillance.

Under the EU MDR, most software as a medical device will be classified as Class IIa or above.

Like the EU MDR, the EU’s Regulation (EU) 2024/1689 (the “AI Act”) also distinguishes between AI systems that pose different levels of risk.

The AI Act imposes onerous obligations on “high risk” AI systems, including in relation to accuracy, transparency, risk management, data quality and governance, and human oversight.

Although there is some overlap between the EU MDR and AI Act requirements, many are new AI-specific obligations.

These pose a significant additional regulatory burden, increasing the complexity and cost of compliance for stakeholders.

Notably, the risk classification of an AI system that is itself, or is included in, a medical device is linked to the device’s classification under the EU MDR. Under the AI Act, AI systems are classified as “high risk” systems if:

(a) the AI system is a safety component of a medical device or the AI system itself is a medical device; and 
(b) the medical device is required to undergo a third-party conformity assessment under the EU MDR.

Therefore, low risk medical devices (i.e., Class I medical devices) that are self-certified cannot be “high risk” AI systems.

By contrast, any device that requires a notified body to perform its conformity assessment will be a “high risk” AI system, and so will be subject to the additional AI Act requirements.
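
As a rough sketch (ours, not the full legal test), the link between the two frameworks can be read as a simple conjunction:

  # Illustrative only: the two cumulative conditions, (a) and (b) above, that
  # make an AI system in or constituting a medical device "high risk".
  def is_high_risk_ai_system(is_device_or_safety_component_of_device,
                             device_needs_notified_body_assessment):
      return (is_device_or_safety_component_of_device
              and device_needs_notified_body_assessment)

  # Class I software medical device that is an AI system (self-certified): not high risk.
  print(is_high_risk_ai_system(True, False))  # False
  # Class IIa or above device with an AI system as a safety component: high risk.
  print(is_high_risk_ai_system(True, True))   # True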

Unfortunately for those wishing to avoid the “high risk” AI system requirements, there are relatively few Class I devices under the EU MDR.

Therefore, the majority of medical devices that are an AI system or have an AI system as a safety component will qualify as a “high risk” AI system.

One notable example of a Class I device is software intended to support conception by calculating the user’s fertility status based on a validated statistical algorithm.

If this kind of software medical device is also an AI system, it would not be classed as a “high risk” AI system, so it would not be subject to the more onerous requirements in the AI Act.

However, the manufacturers of these devices would need to carefully consider any product developments that add additional functionality, as this can impact the risk classification of the product under both the EU MDR and AI Act.

For example, if the manufacturer added functionality to the Class I device so it could also be used as a means of contraception, it would become a Class IIb medical device and would need a third-party conformity assessment.

In turn, as the software is also an AI system, this would mean the AI system would be considered “high-risk” and be subject to additional regulatory requirements under the AI Act.
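
Expressed in code (again, a simplified illustration using only the two intended purposes discussed in this article, and assuming the software is an AI system), the cascade looks like this:

  # Hypothetical sketch of the purpose -> EU MDR class -> AI Act risk cascade.
  def mdr_class(intended_purposes):
      # Per the examples above: adding contraception pushes the software to
      # Class IIb, whereas fertility-status calculation alone can be Class I.
      return "IIb" if "contraception" in intended_purposes else "I"

  def ai_act_status(intended_purposes):
      needs_notified_body = mdr_class(intended_purposes) != "I"  # Class I can self-certify
      return "high risk" if needs_notified_body else "not high risk"

  print(ai_act_status({"support_conception"}))                    # not high risk
  print(ai_act_status({"support_conception", "contraception"}))   # high risk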

Whilst AI has the potential to provide tremendous benefits for femtech, it also triggers additional complexity that can be time-consuming and costly to navigate.

It is important to get it right in terms of compliance in order to maintain consumer trust, avoid regulatory penalties, and pave the way for long-term success and viability.

How digital twins are making clinical trials more inclusive

For decades, women have been excluded from clinical research, but AI-powered digital twinning is helping make trials safer, more inclusive, and more representative of real patients.

For Karen Yeo, innovation is not just about technology, but how it is used to change things for the better.

Yeo is senior vice president of client and regulatory strategy at Certara, a biosimulation company focused on “transforming drug development for good”, which, since 2014, has supported 90 per cent of novel FDA drug approvals.

Yeo and her team are using AI to make scientific research more inclusive by creating digital twins, matching the characteristics of a real patient with their digital counterpart to predict optimal dosing.

Women, particularly pregnant women, have typically been excluded from clinical trials for decades.

In 1977, following the thalidomide crisis of the 1950s and 1960s, the US FDA introduced guidelines preventing women of childbearing age from participating in many studies to avoid possible risks to pregnancies.

This was eventually lifted by the FDA in 1993 and, the same year, the National Institutes of Health (NIH) Revitalisation Act was passed, requiring the inclusion of women and minorities in federally funded clinical research.

But while these changes opened the door to more representative trials, pregnant women continued to be excluded due to safety and ethical concerns.

“[Researchers] fear that exposing expectant mothers and their babies to experimental medicines could cause harm, so the default has been to leave them out,” Yeo tells Femtech World.

While “well-intentioned”, policies which prevented them from participating in clinical trials have contributed to “major gaps” in our understanding of how treatments work in women, and slowed down access to potentially life-saving treatments, leaving them facing “more risk, not less”, says Yeo.

“The result is a healthcare system where women are underserved,” she adds. “And both mothers and infants miss out on advances that could have been better supported through careful, inclusive trial design.”

The evidence gap

Excluding women from trials has delayed access to lifesaving therapies, and without the right evidence, clinicians often prescribe medications off-label during pregnancy with little understanding of how those drugs behave in women’s bodies.

For years, drug dosing was determined largely from small studies in men, even though women may metabolise and respond to drugs differently, Yeo explains.

“This gap has contributed to higher rates of adverse drug reactions in women and less clarity about the effectiveness of medicines in conditions unique to them, such as pregnancy and postpartum care,” she says.

Yeo points to the example of the antimalarial drug primaquine.

Pregnant and lactating women have typically been excluded from primaquine studies, leaving physicians without the right guidance.

Using biosimulation, Yeo and her team took the findings from a small clinical study conducted in breastfeeding women and were able to predict infant and newborn drug exposures through breast milk.

“The results provided evidence of safe dosing of primaquine in mothers, showing how model-informed methods can begin to close those gaps,” she adds.
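
To give a flavour of the kind of arithmetic such estimates involve (a textbook-style simplification with made-up numbers, not Certara’s biosimulation model), one of the simplest quantities in lactation exposure work is the relative infant dose, which compares the infant’s weight-adjusted dose received via milk with the mother’s weight-adjusted dose:

  # Simplified relative infant dose (RID) calculation; all inputs hypothetical.
  def relative_infant_dose(maternal_dose_mg_per_kg_day,
                           avg_maternal_plasma_mg_per_l,
                           milk_to_plasma_ratio,
                           milk_intake_l_per_kg_day=0.15):  # ~150 mL/kg/day is a common assumption
      milk_conc = avg_maternal_plasma_mg_per_l * milk_to_plasma_ratio
      infant_dose = milk_conc * milk_intake_l_per_kg_day       # mg/kg/day via milk
      return 100 * infant_dose / maternal_dose_mg_per_kg_day   # per cent

  # An RID below roughly 10 per cent is often treated as reassuring, though it
  # is never a substitute for clinical judgement or model-informed analysis.
  print(round(relative_infant_dose(0.5, 0.2, 0.6), 1))  # 3.6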

Digital twins

By creating virtual representations of women’s physiology, including during pregnancy, biosimulation can predict how a drug will behave in these populations without exposing patients to any unnecessary risk.

This allows regulators and clinicians to access insights early and encourages trial sponsors to include women more confidently.

“Models can estimate how a drug transfers through breast milk, giving trial designers a foundation for guidance before patients are enrolled,” Yeo explains.

“This reduces ethical concerns by minimising direct risk while still advancing inclusive research.

“Over time, it shifts the approach from protecting women from research to protecting them through research.”

These models start as virtual populations, representing typical physiological scenarios. But as more personalised clinical data become available in pregnant women, these can be replaced by digital twins.

These are digital replicas of individual patients reflecting real-time individual physiology.

Yeo says these AI-powered digital twins can provide a safe way to generate evidence in individuals that are difficult to study directly, such as pregnant women with preeclampsia.

While no model has yet perfectly captured human biology, Yeo says biosimulation has advanced to a point where it can reflect many of the complexities of women’s health.

“Virtual populations can account for physiological changes during pregnancy, such as altered metabolism, which can affect the exposure of the drug across each of the trimesters,” she says.

“Rather than oversimplifying, they provide a framework that can help inform decision-making for clinicians.

“As real-world and clinical trial results with personalised data become increasingly available, virtual populations can evolve into more complex digital twins.”

“Safeguards start with data”

But as with any AI-driven technology, poorly designed models or those that rely on narrow datasets risk reinforcing existing biases, rather than correcting them.

Yeo says “safeguards start with data”.

Certara emphasises reproducibility and clear documentation, allowing every assumption to be reviewed, and works closely with regulators, clinicians, and patient advocates.

“Trust grows through representation and openness,” she continues.

“Patients need to see that the data behind digital twins includes people like them, and that models continue to be refined with clinical and real-world evidence.

“Clear communication about how the models work and why they matter helps show that these tools are designed with patients in mind.

“By involving women directly, researchers can ensure the models address concerns that matter most, from pregnancy safety to postpartum care.

“When women see that their priorities influence science, they are more likely to trust and benefit from it.”

Accelerating innovation

Having more inclusive datasets will also accelerate innovation in femtech, ensuring developers design new diagnostic tools and therapies for women’s health based on representative science, while data and insights collected through biosimulation can potentially shorten the timeline to regulatory approval.

“Femtech innovation depends on evidence that reflects women’s realities,” says Yeo.

“Digital twins make this possible by generating early data where traditional studies fall short.”

According to Yeo, AI is already having an impact on diversity in clinical trials, with regulators encouraging trial designs that use biosimulation to fill data gaps.

She believes digital twins could become a standard part of trial planning to ensure underrepresented groups are included within the next five years.

“Inclusivity will take time, but the momentum is here. Each step toward more representative datasets and more confident dosing guidance brings us closer to equitable clinical research.”

So what does success look like? That depends on the stakeholder, says Yeo.

For Yeo and her team, it’s being able to generate virtual subjects that reflect the characteristics of the patients receiving the medicine, giving a more realistic view of how therapies perform in practice.

For regulators, it’s the ability to approve drugs with confidence that dosing and safety information work across diverse groups.

And for patients, it is the assurance that the evidence was generated with their needs in mind.

“Success is not about technology alone,” adds Yeo.

“But about the trust and outcomes it creates for people who were once excluded.”

North London NHS Foundation Trust partners with Psyomics to transform mental health support for 1.6 million residents

North London NHS Foundation Trust has signed a new partnership with leading digital health company Psyomics to revolutionise how people across Barnet, Camden, Enfield, Haringey and Islington access mental health care.

The collaboration will see Psyomics’ digital technology rolled out across community services, enabling more than 1.6 million residents to access the right support more easily.

Jess Lievesley is Chief Operating Officer at North London NHS Foundation Trust.

She said: “Our collaboration with Psyomics represents a significant advancement in how we support mental health across the communities of North Central London.

“By combining NLFT’s clinical expertise with Psyomics’ cutting-edge digital innovation, we are making it easier for individuals to access timely care and creating a more personalised care experience.”

Developed with clinical psychologists, psychiatrists and researchers at the University of Cambridge, the Psyomics Platform provides a digital front door for adults aged 18–65 seeking support.

The system captures patient-reported information – from symptoms to social and personal circumstances – giving clinicians holistic insights to inform the most appropriate care pathway.

With 25 per cent of submissions completed outside normal office hours, it improves accessibility and ensures support is available when people need it most, while enabling faster, more accurate decision-making from the outset.

At the same time, referrals are streamlined, meaning patients no longer have to repeat their story multiple times, improving both patient engagement and experience.

By combining clinical expertise with advanced technology, the platform standardises care and reduces administrative burden, ensuring timely, personalised support.

The initiative is a core part of the Trust’s wider transformation programme, with plans to extend the platform beyond adult services to include Talking Therapies, Neurodiversity, and Older Adults in the near future.

Dr Melinda Rees, Chief Executive at Psyomics, said: “Our mission is to make mental health services more accessible, efficient and patient-centred.

“Working with North London NHS Foundation Trust means 1.6 million people will benefit from a simpler, faster route to the care they need.

“By combining clinical expertise with cutting-edge technology, we can help shape the future of mental health care, making it faster, more efficient, and more patient-centred.”

As demand for mental health services continues to rise, the North London partnership is set to become a national model for modernising NHS mental health pathways, offering a scalable solution that other Trusts may follow.
