Embracing AI Responsibly - Interim Guidance 

The information below provides interim guidance on the responsible use of Artificial Intelligence (AI) tools. It reflects our continuing journey with AI as part of the government's 10 Year Health Plan for England and our own digital plans as a Trust.  

What you need to know 

AI tools are already changing how we work, helping us save time and work more efficiently. Many colleagues are now using Microsoft Copilot Chat (available free with your NHS Microsoft account) for quick, conversational support - such as summarising documents, generating ideas, drafting presentation slide decks or answering questions. Others are exploring the enhanced capabilities of Microsoft Copilot (available with an additional paid licence) to support more advanced tasks. Both tools are proving to be valuable aids in our day-to-day work. 

AI undoubtedly has a key role to play in the future of health and care. It’s essential, however, that we introduce these new technologies in a safe, responsible and appropriate way. 

Appropriate use of AI  

Copilot and Copilot Chat are the only AI tools currently approved for work use within Mersey Care NHS Foundation Trust. When accessing these tools, please ensure you are logged in using your NHS Microsoft account. 

It is essential that these tools are used responsibly in accordance with data protection best practice and that you carefully check the information generated for any inaccuracies.  

Although information entered into Copilot Chat and Copilot remains secure within our NHS digital network, care record data from other systems must not be entered into these tools.  

AI tools that have not been approved for use - such as ChatGPT or Otter.ai - must not be used for work purposes. 

If you join a meeting where confidential or sensitive information is being discussed and a third-party AI tool is in use (e.g. one labelled ‘AI notetaker’ or ‘Meeting Agent’, or appearing as an unknown participant), please ask for the tool to be switched off.  

This is because third-party AI tools may store data outside the NHS’s secure digital infrastructure, and potentially outside the UK - posing risks to data privacy and security. 

If you're unsure how to proceed, speak to your line manager or contact your Information Governance (IG) team before continuing. 

AI in clinical practice 

Copilot Chat and Copilot must not be used for informing or supporting clinical decision-making, direct patient care, or any activity requiring clinical judgement.  

Whilst transcription can be used in a Microsoft Teams meeting, Copilot Chat and Copilot should not be used to summarise clinical content, and users must not request, generate, or act on AI outputs in any clinical context. 

If you are considering the use of any AI tool for clinical purposes, please contact the Digital Clinical Team.  

In line with NHS England guidance, any AI tool that generates or summarises patient information is considered a medical device and must meet the following requirements before it can be used in clinical settings: 

  • Completion of a Clinical Safety Assessment in accordance with the DCB0160 standard 

  • A Data Protection Impact Assessment (DPIA) to ensure lawful and secure data use 

  • Compliance with MHRA Class I medical device requirements if the tool performs clinical summarisation or documentation 

What’s happening now? 

An AI Assurance, Safety and Oversight Group has been established to oversee all AI requests and pilots. This group will guide our safe and effective adoption of AI technologies across the Trust.  

We are also working closely with digital, clinical safety, IT security and IG teams to complete the necessary assessments. Looking ahead, we hope to launch our first Ambient Voice Technology (AVT) pilot by the end of the year - a major milestone in our AI journey. 

Additional guidance will be shared soon, along with training to support everyone in using our approved AI tools confidently and safely.