Microsoft 365 Copilot (pilot phase)
The ICO is trialling the use of Microsoft 365 Copilot (Copilot) to enhance staff productivity and efficiency. The pilot phase will focus on using Copilot in Microsoft Teams (Copilot in Teams) for a limited number of staff across the organisation. Fewer than 10% of staff will be given a licence to use Copilot in Teams for this pilot.
Copilot utilises large language models (LLMs), a type of artificial intelligence (AI) algorithm that uses deep learning techniques to understand, summarise, predict, and generate content.
The use of Copilot in Teams will cover several activities including:
- Generating pre-read material in preparation for meetings.
- Supporting staff during meetings, for example by summarising the meeting so far, or by submitting prompts and questions to Copilot.
- Producing meeting notes or minutes after a meeting, and providing a summary of the meeting to staff who were not able to attend.
The ICO’s data, used with Copilot, is stored within our secure Microsoft Azure tenant, which is held within the UK and is not used for training Microsoft models.
Where the ICO uses customers' personal data with Copilot, this is processed under the lawful basis of 'public task' to fulfil our legal and statutory duties under the legislation, eg to consider a complaint, provide you with guidance, or investigate a breach. In other circumstances, your personal data will be processed under the lawful basis of 'legitimate interests', eg in recruitment. For more information, please refer to the relevant sections in our privacy notice.
We may collect audio and visual recordings, as well as transcripts, of conversations. The transcription function in Teams allows Copilot to produce the output referred to above. Recordings and transcripts are automatically deleted after 21 days. Copilot prompts and responses are deleted after 7 days in line with our Teams chat retention policy.
No decision affecting a customer will be taken based on Copilot output. All direct outputs from Copilot carry the statement "AI-generated content may be incorrect", and our training stipulates that human oversight is always required to verify outputs.
Following the pilot, we will carry out a full analysis of the use of Copilot in Teams, review the risks and the benefits, and agree our next steps. Any extension of the use of Copilot beyond the pilot will be considered under our risk assessment processes and the privacy notice will be updated accordingly. We will continue monitoring the deployment of Copilot in the pilot phase and will update our risk assessments as needed.