
Freshfields Risk & Compliance

Reposted from Freshfields Technology Quotient

UK Government responds to Whitehead Review’s findings of systemic bias in UK medical devices

In March 2024, the independent review into equity in medical devices chaired by Professor Dame Margaret Whitehead published its long-awaited final report, making wide-ranging recommendations to tackle existing and future risks of bias in medical devices on the UK market (the Report). At the same time, the UK government issued a policy paper in response, addressing the recommendations. Here, we examine some of the themes that emerged from the Report and the UK government’s response (the Response).

The equity in medical devices review: brief background and scope 

In 2021, the Department of Health and Social Care (DHSC) commissioned an independent ‘equity in medical devices’ review into potential systemic bias in medical devices on the UK market (see our earlier commentary here). The initial driver for the review was concern that pulse oximeters, which were widely used during the COVID-19 pandemic to measure blood oxygen saturation, may be less accurate in patients with darker skin tones.

Once commissioned, the scope of the review was expanded beyond oximeters to encompass three categories of medical devices which “may be particularly prone to racial, ethnic or other unfair biases”: (1) optical medical devices, including oximeters, which send light waves through patients’ skin to measure underlying physiology (and can perform differently on different skin colours); (2) artificial intelligence (AI)-enabled medical devices; and (3) polygenic risk scores (PRS) in genomics (a statistical way of measuring the risk of developing disease based on genomic data). The review focussed on the potential for ethnic and racial inequities, which the panel considered were more likely to arise in the devices under review. 
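
To make the PRS concept concrete, below is a deliberately simplified sketch (our own illustration, not drawn from the Report) of how a polygenic risk score is conventionally computed: a weighted sum of an individual’s risk-allele counts, with the weights (“effect sizes”) estimated from genome-wide association studies. The variant IDs and numbers are hypothetical.

  # Purely illustrative sketch of a polygenic risk score (PRS).
  # Variant IDs and effect sizes below are hypothetical.
  def polygenic_risk_score(allele_counts, effect_sizes):
      # allele_counts: variant -> number of risk alleles carried (0, 1 or 2)
      # effect_sizes: variant -> per-allele effect size (beta) estimated by a
      # genome-wide association study (GWAS)
      return sum(effect_sizes[v] * allele_counts[v] for v in effect_sizes)

  betas = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.30}  # hypothetical GWAS weights
  genotype = {"rs0001": 2, "rs0002": 1, "rs0003": 0}         # one individual's genotype
  print(polygenic_risk_score(genotype, betas))               # prints 0.19

Because the effect sizes are typically estimated from reference cohorts that are predominantly of European ancestry, a score built this way can be less predictive for people of other ancestries, which is the source of the bias the Report goes on to describe.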

The Report: some key findings

The Report was published on 11 March 2024. It concluded that there is evidence of unfair bias in medical devices in the UK which, although “largely unintentional”, pointed to a need to review the system as a whole. It found that differences in socio-economic conditions, together with “systemic structural issues”, further amplify this existing bias. The Report also identified potential future risks of bias as technology in the medical devices industry continues to develop, and urged “a renewed sense of urgency and commitment to address inequity…at the highest levels of government”.

In relation to AI-enabled medical devices, the Report found that the evidence for adverse clinical impacts of biases is “patchy”, but that “existing biases and discrimination in society can unwittingly be incorporated at every stage of the lifecycle…and then magnified in algorithm development and machine learning”. It expressed some concern that the “exponential increase” in AI-driven applications in medical devices has “far surpassed” any increase in regulation of AI used to support clinical decision making and AI-derived predictive analytics. 
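
To illustrate how under-representation in training data can translate into unequal performance (a minimal simulation of our own, using synthetic data and assumed group sizes, not an example from the Report), consider a classifier trained on a dataset in which one group makes up only 5 per cent of the sample and the measured feature behaves differently for that group. The model’s headline accuracy can look respectable while it performs markedly worse for the minority group, a gap that only per-group evaluation reveals.

  # Minimal, purely synthetic illustration of bias from under-representation.
  import numpy as np
  from sklearn.linear_model import LogisticRegression

  rng = np.random.default_rng(0)

  def make_group(n, shift):
      # 'shift' moves the decision boundary for this group, standing in for a
      # measurement (e.g. a sensor reading) that behaves differently across groups.
      x = rng.normal(0.0, 1.0, n)
      y = (x + shift > 0).astype(int)
      return x.reshape(-1, 1), y

  xa, ya = make_group(1900, 0.0)  # well-represented group A (95% of training data)
  xb, yb = make_group(100, 1.5)   # under-represented group B (5%)
  model = LogisticRegression().fit(np.vstack([xa, xb]), np.concatenate([ya, yb]))

  # Evaluate each group separately on fresh data: group B's accuracy comes out
  # markedly lower, a disparity an overall accuracy figure would conceal.
  for name, shift in [("A", 0.0), ("B", 1.5)]:
      xt, yt = make_group(5000, shift)
      print(name, round(model.score(xt, yt), 3))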

With respect to PRS, the Report found that the data sources on which PRS draw have a “well-established bias against groups with non-European genetic ancestry”. It also expressed concern about the potential for misinterpretation of results by the public and health professionals alike, especially in relation to genetic determinism, which may carry wider societal risks.

Although the topic was outside its scope, the Report included a call to action to review the equity of medical devices encountered during pregnancy and the neonatal period, as part of wider investigations into health outcomes for ethnic minority and poorer women and their babies.

The Report made 18 specific recommendations, 51 sub-recommendations, and three further “calls to action” to the UK government. 

What is the UK government going to do in response?

Overall, the UK government welcomed the Report and its recommendations, indicating in the Response that work to address many of the identified inequities is already underway. The Response referred in particular to existing or planned work to be carried out as part of: the UK’s inaugural MedTech Strategy, published in February 2023 (see further here); the anticipated new UK regulatory framework for medical devices (see further here); and, with respect to AI, the MHRA’s Software and AI as a Medical Device Change Programme. Some highlights from the Response include:

  • Optical medical devices (including oximeters):
    • The UK government refers to existing work by the MHRA, including the recent introduction of a new validation process for clinical investigations, requiring applicants to demonstrate how they intend to address bias. 
    • The MHRA plans further investigation into whether there is evidence of inaccuracy across different skin tones for other optical devices, and will propose “relevant regulatory action”.
    • Work is underway to ensure diverse skin tones are included in imaging databanks, to mitigate the risk of racial bias in datasets.
    • Consideration will be given to how the post-market surveillance approaches currently under development by the MHRA can be aligned with the monitoring of optical devices in real-world applications.
  • AI-enabled medical devices:
    • The Response expressed a commitment to continue working to improve the transparency of data in AI-enabled devices, with the aim of improving the safety and trustworthiness of AI products that influence clinical decisions.
    • The Response points to a range of measures to support regulators in addressing the risks and challenges posed by new AI technologies, including guidance on regulatory principles for AI. It referred to ongoing and planned work related to the MHRA’s Software and AI as a Medical Device Change Programme. The Report had explicitly noted that the panel had been impressed by the initiatives on equity and AI which were already underway. 
    • It is clear from the Response that the UK government sees an important role for guidance and soft law, rather than hard law, in the regulation of AI:
      • It emphasised the importance of an “agile” approach to software and AI, with the MHRA intending to address most AI and software-related requirements in regulatory guidance. 
      • It indicated that the Report’s recommendation that manufacturers should be required to report the diversity of data used to train algorithms would be better captured in guidance, stating that there are “no current plans to implement this as a legal requirement”.
  • Call to action with respect to medical devices encountered during pregnancy and the neonatal period: The UK government noted the ongoing independent review of cases of concern in maternity services, but recognised that further research is needed and that the DHSC would take this forward.

Comment

  • Most of the UK government’s planned action in the Response refers to existing programmes of work or planned regulatory changes. That is not surprising given that the review’s findings were presented to the UK government in June 2023 and the review has been ongoing since 2021.
  • We will need to see the concrete drafting of the new UK medical devices regulations (not expected until next year based on the latest Roadmap) and related guidance – which is due to play a critical role with respect to AI and software in medical devices in particular – before we know if, and precisely how, these recommendations will be implemented. For more insights into the evolving landscape of AI regulation, please see here.
  • The MHRA separately welcomed the review, stating that “more needs to be done to address inequalities in relation to the regulation of medical devices”. 

Tags

ai, life sciences, regulatory