In recent guidance, NHS England's national chief clinical information officer, Alec Price-Forbes, warned doctors and hospitals in England to immediately stop using any artificial intelligence (AI) ambient scribe tools (also known as Ambient Voice Technology (AVT)) that have not been officially registered for medical use and approved under NHS data governance and clinical safety standards.
The findings of a recent Sky News investigation into the use of AI in healthcare come at the same time as the announcement that the UK is to become the first country to join a new international network of health regulators focused on the safe and effective use of AI in healthcare. On 24 June, the Medicines and Healthcare products Regulatory Agency (MHRA) joined the Health AI Global Regulatory Network as a founding ‘pioneer’ member, positioning the UK as a key party in efforts to shape international rules and oversight for AI used in clinical care, enhancing patient care whilst maintaining ethical and safety standards.
The use of AI tools such as AVT in a healthcare or clinical setting appears to be a key focus within the industry as a means to improve efficiency and outcomes as well as drive innovation. But it also raises a number of significant legal and regulatory compliance issues, including:
- Data Protection Risks: Incorporating patient data into AI systems which are not approved, and which may therefore not meet the necessary technical and organisational security standards under UK GDPR, carries a significant risk of breaching data protection law obligations. This is particularly so because many of these tools will process "special category" personal data, requiring the explicit consent of service users or another lawful basis before that processing can take place. Having a properly drafted and robust data protection framework in place is therefore important when considering the use of these technologies. Such a framework will help to ensure that there is an appropriate lawful basis for processing personal data in this way, that patients are made aware that their data is being used in this way through properly drafted privacy notices and documented DPIAs for the relevant tools, and that any provider's security standards are adequate under UK GDPR.
- Regulatory Liability and Clinical Risk: Deploying tools which have not been properly registered can breach not only data protection laws but also medical device regulations, clinical governance requirements and the professional standards which underpin end user trust in the services being provided. Additionally, the use of AI tools which have not been properly approved and validated can result in AI "hallucinations", leading to mis-documentation and clinical error and increasing the risk of regulatory action.
- Data Breaches: The use of a variety of unapproved technologies presents a risk of fragmented and insecure data storage, increasing the likelihood of data protection breaches that could further undermine user trust in the services being provided.
Although the use of AVT and other AI tools in a clinical or healthcare setting presents potential risks, these technologies can also offer significant benefits, including reducing the administrative burden on healthcare practitioners and clinicians and allowing them to focus on patient-facing work, potentially improving patient care.
Balancing innovation with responsibility
Whilst AI can offer many benefits in healthcare settings, such as reducing the burden on clinicians and improving record keeping, it also carries significant legal, ethical, and trust risks if implemented too rapidly or without due oversight.
It is therefore important that organisations using AI tools in a clinical setting take steps to define standards, balance innovation with compliance and ensure that patient trust remains central to the development of AI integration.
If you have any questions about your organisation's responsibilities in relation to the use of AI in a healthcare or clinical setting, please get in touch with Valerie Armstrong-Surgenor, Melissa Hall or Lauren McLeod.