The NHS Says You're Still Responsible (Even When AI Writes Your Notes)
Ali Vatan

NHS England has published guidance on AI scribes in healthcare. The message is clear: the technology can help, but accountability stays with you.
NHS England published guidance on AI-enabled ambient scribing products in April 2025. These are the tools that listen to your clinical conversations and generate structured notes automatically. If you haven’t read it, you should. The central message: the technology can help, but accountability stays with you.
I think that’s entirely reasonable.
What the guidance covers
The NHS England guidance sets out a framework for responsible adoption of AI scribes across health and care settings. The key points:
- Clinicians remain accountable for their clinical records, full stop. If an AI scribe generates your notes, those notes are still your notes. You’re responsible for accuracy, completeness, and clinical appropriateness.
- Ambient scribing products using generative AI for summarisation (rather than simple transcription) are likely to qualify as medical devices and must meet relevant regulatory standards.
- All NHS organisations must ensure any ambient voice technology meets specified NHS standards. Non-compliant solutions, including those obtained through free trials, are not permitted.
- Organisations must complete a clinical safety risk assessment and a Data Protection Impact Assessment before deployment. Patient data from clinical sessions should be automatically deleted unless legally or operationally required.
The National Chief Clinical Information Officer also issued a priority notification warning about implementations not meeting clinical safety standards. The message was unambiguous: get this right or don’t do it at all.
This is the right approach
Some people in the AI space have called this guidance overly cautious. I disagree.
Of course clinicians should be accountable for their records. That’s not a new principle; it’s a foundational one. The fact that a machine wrote the first draft doesn’t change who’s responsible for the final product. If you sign off on notes, they’re yours. That was true when your dental nurse wrote them, and it’s true when an AI writes them.
The guidance doesn’t say you can’t use AI scribes. It says you need to use them responsibly, with proper governance, risk assessment, and human oversight. That’s not anti-innovation. That’s good clinical practice.
The detail problem, and why training matters
AI-generated clinical notes are only as good as the model producing them, and not all models are equal.
The risk with ambient scribing is that the AI misses something: a detail mentioned in passing, a subtle clinical finding, a piece of patient history that matters. If you sign off without catching the omission, that gap becomes part of the official record.
But certain large language models are genuinely good at maintaining detail across large context windows. If the model is designed to be comprehensive rather than concise, and specifically trained not to leave out details, the risks can be effectively mitigated. So long as it’s trained properly, this technology can produce notes that are not just adequate but genuinely thorough.
The problem isn’t AI scribes as a concept. The problem is poorly implemented ones: systems that prioritise brevity over completeness, that haven’t been validated in clinical settings, that haven’t been trained on the specific demands of healthcare documentation.
The BDJ’s analysis raised an important point: clinicians are now in what some have termed a “liability sink” for AI tools. You’re accountable for the output, but you didn’t produce the output. That’s an uncomfortable position, and it demands you actually understand what the tool is doing and verify its work.
Human in the loop, always
You can’t take responsibility for something you haven’t read and understood. If an AI scribe generates notes and you rubber-stamp them without reading them properly, you haven’t fulfilled your duty of care. You’ve just automated your accountability away while keeping the liability.
The human in the loop isn’t optional. It’s essential. And “in the loop” doesn’t mean glancing at the output and clicking approve. It means reading the notes critically, checking them against your clinical memory of the consultation, and correcting anything wrong or missing.
This is work. It takes time. But it’s considerably less time than writing everything from scratch, and if done properly, it gives you a better record than most clinicians produce manually, because the AI captures things you might have forgotten to document.
What this means for dental practices
For dentists specifically, AI scribes offer real potential. Clinical note-taking is one of the most time-consuming administrative tasks in practice, and anything that reduces that burden without compromising quality is welcome.
But the NHS guidance applies to all health and care settings, and dental practices (whether NHS or private) need to take it seriously. If you’re using or evaluating an AI scribe:
- Verify compliance. Check that the product meets NHS standards if you’re an NHS practice. Even private practices should use these standards as a benchmark.
- Read the notes. Every single time. Don’t let the convenience of automation become an excuse for skipping review.
- Understand the model. Ask your vendor how the AI handles clinical detail. Does it summarise aggressively? Does it preserve nuance? Has it been validated in dental settings?
- Complete your risk assessments. A clinical safety risk assessment and DPIA aren’t box-ticking exercises. They force you to think through the real risks.
- Keep patient consent front and centre. Patients should know their consultation is being recorded and processed by AI. Transparency isn’t optional.
Powerful assistants, not autonomous systems
AI scribes are coming to dentistry; the efficiency gains are too significant to ignore, and the technology is improving rapidly. But the NHS guidance is a timely reminder that efficiency without accountability is dangerous.
The clinicians who get this right will treat AI scribes as powerful assistants, not autonomous systems. Review the output, challenge it when something doesn’t look right, and never forget that your name is on those notes.
References
- NHS England. “Guidance on the use of AI-enabled ambient scribing products in health and care settings.” April 2025. england.nhs.uk
- NHS England. “AI-enabled ambient scribing products in health and care settings.” england.nhs.uk
- British Dental Journal. “NHS England guidance on AI scribes.” Nature, 2025. nature.com/articles/s41415-025-9061-0
- BDJ Team. “AI and record-keeping.” Nature, 2025. nature.com/articles/s41407-025-3071-2
- Chronicle Law. “NHS and AI Scribes.” January 2026. chroniclelaw.co.uk