By Lloyd Price

Preventing bias in AI-assisted Medical Devices: Department of Health and Social Care independent review



Exec Summary:


The Department of Health and Social Care (DHSC) commissioned an independent review on equity in medical devices, which concluded in March 2024. Here's a summary of the key points:


Background:


  • The review was triggered by concerns about racial bias in pulse oximeters, devices used to measure blood oxygen levels. Inaccurate readings for patients with darker skin tones could lead to delayed or missed critical care.

  • The scope was broadened to encompass a wider range of medical devices and potential biases, including those based on sex and socio-economic background.

Review Process:


  • A panel of experts in healthcare inequalities and AI diagnostics led the review.

  • They considered academic research, engaged with stakeholders (patients, healthcare professionals, manufacturers) and held a public call for evidence.


Outcomes:


  • The final report, published on March 11, 2024, details the findings and recommendations to ensure fairer medical devices.


Key Recommendations


  • Improving Pulse Oximeters: The review calls for immediate actions to ensure existing pulse oximeters work for all patients. This includes clearer guidance for healthcare professionals and patients on using the devices with different skin tones. Additionally, it recommends strengthening approval standards for new pulse oximeters to guarantee accuracy across diverse skin tones.

  • Addressing Bias in AI Medical Devices: A significant portion of the recommendations target preventing bias in AI-assisted medical devices. These include requiring developers to involve diverse populations during the design process and using a wider range of data in training AI models. The review also emphasises the importance of making AI models more transparent and easier to understand, so that potential biases can be identified and addressed.

  • Focus on Broader Equity: The review goes beyond pulse oximeters and racial bias. It recommends considering biases based on factors like sex and socioeconomic background in the design and development of all medical devices.

  • Enhancing Regulatory Frameworks: The recommendations call for strengthening regulations to ensure fair and equitable medical devices. This includes requiring manufacturers to submit data on how their devices perform in diverse populations during the approval process.

  • Preparing for the Future: The review acknowledges the rapid development of medical devices, particularly those involving large language models and AI. It recommends ongoing research and the development of frameworks to ensure these future technologies are designed and used equitably.
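The pulse oximeter recommendation above is, at heart, a measurement question: does the device's error differ systematically between skin-tone groups? A minimal sketch of that check is below. All numbers and the function name are illustrative assumptions, not data or methods from the review.

```python
# Sketch: quantifying per-subgroup pulse oximeter bias on paired readings.
# All data here is synthetic and illustrative, not from the review.
from statistics import mean

# Each record: (skin_tone_group, oximeter_SpO2, arterial_reference_SaO2)
readings = [
    ("light", 95, 94), ("light", 97, 97), ("light", 92, 91),
    ("dark", 96, 92), ("dark", 94, 90), ("dark", 97, 94),
]

def mean_bias_by_group(records):
    """Mean (device reading - reference) per group; a positive value
    means the device overestimates oxygen saturation for that group."""
    groups = {}
    for group, spo2, sao2 in records:
        groups.setdefault(group, []).append(spo2 - sao2)
    return {g: mean(diffs) for g, diffs in groups.items()}

print(mean_bias_by_group(readings))
```

In this synthetic example the device overestimates saturation more for the darker skin-tone group, which is exactly the kind of disparity, risking missed hypoxaemia, that the review flags.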


Growth and M&A for Healthcare Technology companies


Healthcare Technology Thought Leadership from Nelson Advisors – Market Insights, Analysis & Predictions. Visit https://www.healthcare.digital 


HealthTech Corporate Development - Buy Side, Sell Side, Growth & Strategy services for Founders, Owners and Investors. Email lloyd@nelsonadvisors.co.uk  


HealthTech M&A Newsletter from Nelson Advisors - HealthTech, Health IT, Digital Health Insights and Analysis. Subscribe Today! https://lnkd.in/e5hTp_xb 


HealthTech Corporate Development and M&A - Buy Side, Sell Side, Growth & Strategy services for companies in Europe, Middle East and Africa. Visit www.nelsonadvisors.co.uk  




Department of Health and Social Care independent report


Findings and recommendations of the independent review into racial, ethnic and other factors leading to unfair biases in the design and use of medical devices.


Recommendation 8


AI-enabled device developers and stakeholders, including the NHS organisations that deploy the devices, should engage with diverse groups of patients, patient organisations and the public, and ensure they are supported to contribute to a co-design process for AI-enabled devices that takes account of the goals of equity, fairness and transparency throughout the product’s lifecycle.


Engagement frameworks from organisations such as NHS England can help hold developers and healthcare teams to account for ensuring that existing health inequities affecting racial, ethnic and socio-economic subgroups are mitigated in the care pathways in which the devices are used.


Recommendation 9


The government should commission an online and offline academy to improve the understanding among all stakeholders of equity in AI-assisted medical devices.


This academy could be established through the appropriate NHS agencies, and should develop material for lay and professional stakeholders to promote better ways for developers and users of AI devices to address equity issues, including:


  • ensuring undergraduate and postgraduate health professional training includes the potential for AI to undermine health equity, and how to identify and mitigate or remove unfair biases

  • producing materials to help train computer scientists, AI experts and design specialists involved in developing medical devices about equity, and systemic and social determinants of racism and discrimination in health

  • ensuring that clinical guideline bodies identify how health professionals can collaborate with other stakeholders to identify and mitigate unfair biases that may arise in the development and deployment of AI-assisted devices

  • encompassing an appreciation of AI within a whole-system and lifecycle perspective, and understanding the end-to-end deployment and potential for inequity

Recommendation 10


Researchers, developers and those deploying AI devices should ensure they are transparent about the diversity, completeness and accuracy of data through all stages of research and development. This includes the sociodemographic, racial and ethnic characteristics of the people participating in development, validation and monitoring of product performance.


This should include:


  • the government resourcing MHRA to provide guidance on the assessment of biases that may have an impact on health equity in its evaluation of AI-assisted devices, and the appropriate level of population detail needed to ensure adequate performance across subgroups

  • encouraging the custodians of datasets to build trust with minoritised groups and take steps with them to make their demographic data as complete and accurate as possible, subject to confidentiality and privacy

  • developers, research funders, regulators and users of AI devices recognising the limitations of many commonly used datasets, and seeking ones that are more diverse and complete. This may require a concerted effort to recruit and sample underrepresented individuals. We commend initiatives internationally and in the UK (such as the National Institute for Health and Care Research-led INCLUDE guidance) to encourage the development and use of more inclusive datasets. Data collection by public bodies must be properly resourced so that datasets are accurate and inclusive

  • dataset curators, developers and regulators using consensus-driven tools, such as those by STANDING Together, to describe the datasets that are used in developing, testing and monitoring

  • regulators requiring manufacturers to report the diversity of data used to train algorithms

  • regulators providing guidance that helps manufacturers enhance the curation and labelling of datasets by assessing bias, being transparent about limitations of the data, the device and the device evaluation, and how to mitigate or avoid performance biases

  • regulators enforcing requirements for manufacturers to document and publicise differential limitations of device performance and, where necessary, place reasonable restrictions on intended use

  • making sure that the Health Research Authority and medical ethics committees approving AI-enabled device research do not impose data minimisation constraints that could undermine dataset diversity or the evaluation of equity in the outcomes of research

Recommendation 11


Stakeholders across the device lifecycle should work together to ensure that best practice guidance, assurance and governance processes are co-ordinated and followed in support of a clear focus on reducing bias, with end-to-end accountability.


This should include:


  • MHRA adjusting its risk assessment of AI-assisted devices so that all but the simplest and lowest-risk technologies are categorised under Class IIa or higher, including a requirement for their algorithms to be suitable for independent evaluation, the use of a test of overall patient benefit that covers the risks of biased performance, and a requirement for manufacturers to publish performance audits with appropriate regularity that include an assessment of bias

  • supporting health professionals’ involvement early in the development and deployment of AI devices. We commend the use of ethical design checklists, which may assist in the quality assurance of these processes

  • all stakeholders supporting MHRA’s Software and AI as a Medical Device Change Programme Roadmap, such as promoting the development of methodologies for the identification and elimination of bias, and testing the robustness of algorithms to changing clinical inputs, populations and conditions

  • placing a duty on developers and manufacturers to participate in auditing of AI model performance to identify specific harms. These should be examined across subgroups of the population, monitoring for equity impacts rather than just unequal performance
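The auditing duty in the last point above implies comparing model performance across population subgroups rather than reporting a single headline figure. A minimal sketch of such a check follows; the records, tolerance value and function name are illustrative assumptions, not an audit methodology from the review.

```python
# Sketch: auditing a model's error rate per population subgroup and
# flagging groups whose rate exceeds the overall rate by a tolerance.
# Data and the tolerance threshold are illustrative assumptions.

def audit_by_subgroup(records, tolerance=0.05):
    """records: (subgroup, predicted, actual) triples. Returns the
    subgroups whose error rate exceeds overall error rate + tolerance."""
    totals, errors = {}, {}
    all_n = all_err = 0
    for group, pred, actual in records:
        totals[group] = totals.get(group, 0) + 1
        wrong = int(pred != actual)
        errors[group] = errors.get(group, 0) + wrong
        all_n += 1
        all_err += wrong
    overall = all_err / all_n
    return {g: errors[g] / totals[g] for g in totals
            if errors[g] / totals[g] > overall + tolerance}

results = [
    ("A", 1, 1), ("A", 0, 0), ("A", 1, 1), ("A", 0, 0),
    ("B", 1, 0), ("B", 0, 0), ("B", 1, 1), ("B", 0, 1),
]
print(audit_by_subgroup(results))
```

Note that a real audit, as the review stresses, should monitor equity of outcomes rather than just unequal raw performance, so a check like this is a starting point, not the whole exercise.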

Recommendation 12


UK regulatory bodies should be provided with the long-term resources to develop agile and evolving guidance, including governance and assurance mechanisms, to assist innovators, businesses and data scientists to collaboratively integrate processes in the medical device lifecycle that reduce unfair biases and their detection, without being cumbersome or blocking progress.


Recommendation 13


The NHS should lead by example, drawing on its equity principles, influence and purchasing power, to influence the deployment of equitable AI-enabled medical devices in the health service.


This should include:


  • NHS England and the NHS in the devolved administrations including a minimum standard for equity as part of the pre-qualification stage when establishing national framework agreements for digital technology

  • NHS England updating the digital technology assessment criteria used by health and social care teams when buying digital technology to recommend equity as part of the pre-purchase validation checks

  • working with manufacturers and regulators to promote joint responsibility for safety monitoring and algorithm audits to ensure outcome fairness in the deployment of AI-assisted devices. This will require support for the creation of the right data infrastructure and governance

Recommendation 14


Research commissioners should prioritise diversity and inclusion. The pursuit of equity should be a key driver of investment decisions and project prioritisation. This should incorporate the access of underrepresented groups to research funding and support, and inclusion of underrepresented groups in all stages of research development and appraisal.


This should include:


  • requiring that AI-related research proposals demonstrate consideration of equity in all aspects of the research cycle

  • ensuring that independent research ethics committees consider social, economic and health equity impacts of AI-related research

Recommendation 15


Regulators should be properly resourced by the government to prepare and plan for the disruption that foundation models and generative AI will bring to medical devices, and the potential impact on equity.


A government-appointed expert panel should be convened - made up of clinical, technology and healthcare leaders, patient and public involvement representatives, industry, third sector, scientists and researchers who collectively understand the technical details of emerging AI and the context of medical devices - with the aim of assessing and monitoring the potential impact of LLMs and foundation models on AI quality and equity.




Reaction to the Department of Health and Social Care's independent review


The reaction to the Department of Health and Social Care's independent review on equity in medical devices has been generally positive. Here's a breakdown of some key perspectives:


  • Patient Advocacy Groups: These groups have largely welcomed the review. They see it as a significant step towards ensuring all patients receive fair and accurate diagnoses and treatment regardless of background. The focus on diverse datasets and broader equity is seen as a positive move for inclusivity in healthcare.

  • Medical Device Industry: The industry acknowledges the importance of addressing bias in medical devices. However, there are concerns about the feasibility and cost of implementing some recommendations, particularly those related to wider data collection and upfront testing for diverse populations. Collaboration between regulators and manufacturers will be crucial in finding solutions.

  • Healthcare Professionals: Doctors and nurses see the review's findings as concerning, highlighting potential blind spots in current practices. They believe clearer guidelines and training on using medical devices equitably are necessary. Additionally, ensuring transparency in AI-powered devices is seen as crucial for building trust with patients.

  • Government Response: The government has acknowledged the importance of the review's findings and expressed commitment to implementing its recommendations. They've already begun working with stakeholders to address immediate concerns like improving pulse oximeters.

Overall, the review has sparked important conversations about ensuring fairness and equity in medical technologies. The success of the recommendations will depend on collaboration between the government, healthcare providers, and the medical device industry.





