Regulating AI for the NHS: Why Implementation, Not Innovation, Is the Real Test

The UK does not need more AI rules in healthcare – it needs a framework the NHS can actually use, says policy institute Curia.

The UK’s AI healthcare challenge is not weak regulation, but fragmented implementation. A new response to the call for evidence from the Medicines and Healthcare products Regulatory Agency (MHRA) National Commission into the Regulation of AI in Healthcare argues that unless regulation is aligned with NHS decision making, assurance processes and clinical governance, safe and effective AI will continue to stall before it reaches patients.

From regulatory intent to NHS reality

The debate on artificial intelligence (AI) regulation in healthcare is often framed as a choice between safety and innovation. In reality, the central problem facing the NHS is neither the absence of regulation nor a lack of technical capability. It is the difficulty of navigating multiple, overlapping assurance and approval processes that were never designed for adaptive, data-driven technologies.

The response submitted to the MHRA consultation makes a clear case: significant reform is needed, but not because standards are too weak. The challenge lies in how regulation is operationalised inside NHS organisations, where adoption decisions are made under pressure, risk is unevenly distributed, and accountability is often unclear.

Without reform that simplifies decision making while maintaining clear safety guardrails, AI will continue to be piloted repeatedly, adopted unevenly, or quietly avoided altogether.

Why the current framework slows adoption

From an NHS implementation perspective, regulation currently asks trusts to interpret evidence, risk and responsibility across several bodies at once: the MHRA, the National Institute for Health and Care Excellence (NICE), NHS England and local digital governance. Each plays a legitimate role, but together they create friction.

Trusts struggle to answer basic operational questions. What can be deployed locally under existing governance? What requires additional assurance? What must be adopted nationally? In the absence of clarity, organisations default to caution. The result is duplication of pilots, delays in scaling and wide variation between systems.

Curia’s consultation response highlights that regulatory evidence alone is rarely enough to unlock adoption. NHS leaders need explicit guidance on what constitutes sufficient evidence to proceed, aligned to real world deployment decisions rather than abstract approval thresholds. Without shared expectations, trusts are left to reinvent evaluation processes at local level, adding cost without improving safety.

Post-market surveillance must fit NHS governance

Nowhere is the gap between regulatory design and NHS reality clearer than in post-market surveillance. Current approaches too often treat AI oversight as a specialist or exceptional activity, rather than part of routine service delivery.

In practice, NHS organisations already operate clinical governance, patient safety and digital oversight structures. AI monitoring should be embedded into these processes, using familiar mechanisms such as audit, incident reporting and quality improvement cycles. Parallel reporting systems that sit outside day-to-day governance are unlikely to be sustained.

Local performance variation also matters. Evidence from NHS deployments shows that AI tools can behave differently depending on population characteristics, data quality and workflow integration. Effective surveillance therefore needs to be visible and actionable at trust level, not only through national dashboards or aggregated metrics.

The response also underlines the importance of distinguishing between static and adaptive systems in operational terms. Predetermined change control plans only add value if they are understandable and usable by local clinical and digital teams.

Curia and UKAI members said the UK does not need more AI rules in healthcare – it needs a framework the NHS can actually use.

Clarifying responsibility without creating fear

Uncertainty over responsibility and liability remains a major barrier to adoption. Existing legal frameworks work in most cases, but leave gaps that create hesitation, particularly where AI is embedded into clinical pathways.

From an NHS perspective, responsibility must align with how care is delivered. Manufacturers should remain accountable for product safety and performance as supplied, including transparency on limitations and updates. Provider organisations should be responsible for safe deployment, governance and training. Clinicians must retain professional judgement, with organisational backing that supports rather than undermines it.

“The UK does not have an AI regulation problem in healthcare – it has an implementation problem. Unless regulatory intent is translated into clear, usable pathways for NHS organisations, safe and effective technologies will continue to stall before they reach patients.”

Ben Howlett, Chief Executive, Curia

Crucially, liability frameworks should reflect reasonable reliance on approved tools used as intended. If AI is encouraged or standardised at system level, it is neither fair nor practical to treat individual clinicians as the default bearers of risk when harms occur.

Fear-based approaches suppress reporting and learning. A framework that prioritises shared accountability, transparency and improvement is more likely to support both safety and scale.

National co-ordination, not new tools

The consultation response is clear that the answer is not another task force or another layer of technology. What is missing is co-ordination.

Many AI tools are procured nationally but implemented locally. Stronger alignment between the MHRA, NICE, NHS England and system-level digital assurance would reduce duplication and give trusts confidence that adoption is genuinely supported at national level.

Regulatory sandboxes and controlled deployment models can help, but only if they align with NHS commissioning, procurement and workforce capacity, and produce learning that is transferable across organisations rather than locked into single-site pilots.

Above all, capacity matters. Without practical guidance, implementation support and shared expertise, even well designed regulatory frameworks will struggle to deliver faster access for patients.

Turning reform into reality

The NHS does not need a wholly new regulatory model for AI. It needs clearer, more implementable guidance that reflects how decisions are made, risks are managed and services are delivered in practice.

The consultation response shows that the path forward lies in operational clarity, shared standards and embedded governance. Reform that focuses on these foundations can unlock safe, effective and scalable AI adoption, not by lowering the bar, but by making it usable.

If regulation is to support innovation in UK healthcare, it must first support implementation.

Find out more

Curia’s Health, Care, and Life Sciences Research Group members have received their copy of the submission.

If you would like to find out more about becoming a member, contact Partnerships Director, Ben McDermott at ben.mcdermott@chamberuk.com.
