BSI paper assesses need for standards for AI in healthcare


BSI, the business standards company, has undertaken research in collaboration with the US standards organisation for medical devices, the Association for the Advancement of Medical Instrumentation (AAMI), to analyse the role that standardisation can play in assisting the deployment of AI solutions in healthcare. This research has been undertaken with support from the UK medical device regulator, the MHRA.


Both the UK and US are acknowledged leaders in innovation for medical technologies and digital healthcare, with regulatory regimes that are considered to be amongst the most rigorous and responsive to innovation worldwide. (MHRA and US FDA have a shared interest in the recommendations in this report.)


Novel medical software can offer earlier diagnosis and targeted treatments for patients whilst ensuring efficiency in healthcare services. However, the emergence of AI and machine learning algorithms in what is already a highly regulated sector poses a challenge for future governance and regulation in terms of safety and effectiveness.


Technology and the move towards core digitisation featured strongly in the NHS Long Term Plan. The plan also acknowledged the need to progress with AI and digital technology in a way that subjects these solutions to the same level of scrutiny that would apply to any other technology. This includes the need for suppliers of such technology to comply with open standards to enable interoperability and continual improvement.


The BSI research explored specific challenges relating to the deployment of AI in healthcare, including the ability of an algorithm to change its output in response to new data, and the level of autonomy introduced by the use of such software.


Anne Hayes, Head of Governance and Resilience at BSI, says: “The healthcare sector is embracing AI with the expectation it can revolutionise patient care in the future, yet this must be balanced with the need to ensure consistency of safety, effectiveness, scalability and fitness for purpose. The recommendations offered in this position paper will ensure that we have a robust standardisation framework to support the effective deployment of these innovative solutions.”


Mark Birse, Group Manager, Device Safety and Surveillance at the Medicines and Healthcare products Regulatory Agency, adds: “We live in an increasingly digital world. Healthcare professionals, patients and the public are using software and stand-alone apps to aid diagnosis and monitor health. Making sure these new software devices, including those using artificial intelligence, are safe and effective is a challenge for developers as well as users.”


The paper recommends a phased programme of standardisation activities, including development of guidelines to cover AI terminologies and validation approaches.


Recommendations:

* Create an international task force to provide oversight for the direction of standardisation activities related to AI in healthcare

* Undertake a mapping exercise to review the current standards landscape and identify opportunities for new or modified publications and navigation tools to meet the needs for deployment of AI in healthcare

* Create an expert group to develop a scope and proposal for a standard covering terminologies and categorisation for AI in healthcare, taking note of the current work programmes in related standardisation communities

* Create an expert group to develop a scope and proposal for guidance to cover validation processes for AI in healthcare (particularly the aspects related to continuous learning)

* Create a communications and engagement plan that will continue to build our understanding of the market challenges and educate communities on the benefits that standardisation can bring to the deployment of AI in healthcare.


