About this guidance
Due to the Data (Use and Access) Act becoming law on 19 June 2025, this guidance is under review and may be subject to change. The Plans for new and updated guidance page will tell you which guidance will be updated and when.
In detail
- Why have you produced this guidance?
- Who’s it for?
- What does it cover?
- What doesn’t it cover?
- How do we use this guidance?
- How does this guidance relate to the OSA?
Why have you produced this guidance?
This guidance explains the data protection and privacy considerations to take into account when you use profiling in your trust and safety processes, including where you do so to comply with the requirements of the Online Safety Act 2023 (OSA).
This guidance follows our guidance on content moderation and data protection. You should read the two together where relevant (eg if you deploy both content moderation and profiling tools in your trust and safety systems).
This series of products is part of our ongoing commitment to publish guidance on online safety technologies, alongside our work to ensure regulatory coherence between the data protection and online safety regimes. We announced this in our 2022 joint statement with Ofcom on online safety and data protection.
Who’s it for?
This guidance applies to any organisations that carry out profiling, as defined in the UK GDPR (see next section), as part of their trust and safety processes. It is aimed at user-to-user services that are using, or considering using, profiling to meet their obligations under the OSA. But it also applies to any organisations using, or considering using, these tools for broader trust and safety reasons.
When we refer to ‘services’ in this guidance, we mean providers of user-to-user services as defined in the OSA, and any other organisations that use profiling tools for trust and safety purposes.
This guidance is also for organisations that provide profiling products and services. We refer to these as ‘third party’ providers of trust and safety profiling tools.
It is for both controllers and processors.
When we refer to ‘trust and safety’ systems in this guidance, we mean the systems and processes that online services use to ensure users are protected from harmful and unwanted experiences.
Online services usually put trust and safety measures in place to:
- comply with the different regulatory regimes for online safety that exist worldwide; and
- ensure a safe experience for users more broadly.
Whether you use profiling to comply with the OSA or for other purposes, you must comply with data protection law.
The key audiences for this guidance are:
- trust and safety professionals;
- designers and engineers of profiling tools for trust and safety;
- those in roles with a data protection compliance focus (eg data protection officers and risk managers); and
- legal professionals.
What does it cover?
This guidance sets out how organisations deploying profiling tools, or providing these tools, for trust and safety systems can comply with the UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018 (DPA 2018). In this guidance, we refer to these collectively as ‘data protection law’.
It also outlines how the Privacy and Electronic Communications Regulations 2003 (PECR) apply to profiling tools. This will be relevant where tools store information, or access information stored, on a user’s device (eg content or other information).
When we refer to ‘profiling tools’ in this guidance, we mean trust and safety tools that involve profiling as defined in the UK GDPR.
Article 4(4) of the UK GDPR defines profiling as:
“any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements”
We provide more detail about the use of profiling tools in trust and safety systems, and the moderation actions that can be involved, in the section on How are profiling tools used in trust and safety systems?
This guidance focuses on the use of profiling tools on user-to-user services. For the purposes of this guidance, we use the same definition of user-to-user services as in the OSA.
Section 3(1) of the OSA defines a user-to-user service as:
“an internet service by means of which content that is generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service”
What doesn’t it cover?
This guidance does not cover:
- profiling you might carry out as part of personalising or tailoring a user’s experience on your service. For example, profiling of users’ interests in order to recommend content to them;
- specific data protection considerations related to training and development of AI-based profiling tools (please consult our guidance on AI and data protection and the response to our consultation series on generative AI for more information);
- your specific obligations under the OSA (Ofcom is the UK's online safety regulator and has published codes of practice, guidance and other resources to support service providers to comply with their duties under the OSA); and
- profiling you use to estimate a user’s age (eg to decide whether a user can access your service or what content you recommend to them). However, this guidance does apply to trust and safety profiling tools that use information about a user's age (either provided by the user directly or predicted using age estimation tools) as input data to assess other characteristics, attributes or behaviours. For example, a grooming detection tool that analyses age data about users, in addition to other data points, to detect suspected grooming activity.
For information about applying data protection law to age estimation, please consult our opinion on age assurance for the children’s code (also known as the Age appropriate design code).
How do we use this guidance?
To help you to understand the law and good practice as clearly as possible, this guidance says what organisations must, should, and could do to comply.
Legislative requirements
- Must refers to legislative requirements within the ICO’s remit, or established case law for the laws that we regulate, that are binding.
Good practice
- Should does not refer to a legislative requirement, but what we expect you to do to comply effectively with the law. You should do this unless there is a good reason not to. If you choose to take a different approach, you must be able to demonstrate that this approach also complies with the law.
- Could refers to an option or example that you could consider to help you to comply effectively. There are likely to be various other ways you could comply.
This approach only applies where indicated in our guidance. We will update other guidance to reflect this approach in due course.
This guidance is not a comprehensive guide to compliance. We have provided more details on some matters in our other guidance, and we link to this further reading where appropriate.
Children’s personal information merits specific protection under data protection law. This means you should take extra care if you use profiling tools that involve processing children’s information. If you process children's personal information, you should conform with our children's code. When we refer to a child we mean anyone under the age of 18. See the section on What if our profiling tools use children’s data? for more information.
How does this guidance relate to the OSA?
The OSA sets out rules for user-to-user and search services. Providers of these services have new duties to protect UK users by assessing and responding to risks of harm. This includes duties on user-to-user service providers to:
- use proportionate measures to prevent users from encountering certain types of illegal content;
- use proportionate measures to effectively mitigate and manage the risk of the service being used for the commission or facilitation of certain offences, as identified in the service’s illegal content risk assessment;
- use proportionate measures to effectively mitigate and manage the risks of harm to people, as identified in the provider’s illegal content risk assessment;
- use proportionate systems and processes to minimise the length of time for which certain illegal content is present on the service; and
- use proportionate systems and processes to swiftly remove any illegal content after becoming aware of its presence on the service.
If a service is likely to be accessed by children, the OSA sets out duties to protect them. These include duties on user-to-user service providers to:
- use proportionate measures to mitigate and manage the risks of harm to children in different age groups, as identified in the provider’s children’s risk assessment;
- mitigate the impact of harm to children in different age groups presented by content that is harmful to children;
- use proportionate systems and processes to prevent children from encountering ‘primary priority content’ that is harmful to them (as defined in the OSA); and
- protect children in age groups judged to be at risk of harm from encountering other content that is harmful to children.
Note that the term ‘likely to be accessed’ by children is used differently under the OSA and in our children’s code. Please consult our ‘likely to be accessed’ guidance and Ofcom’s ‘Children’s access assessments’ guidance for more information.
In addition to the safety duties, the OSA places additional legal obligations on providers of category 1 services. They must use proportionate systems and processes to ensure that taking down or restricting access to content, and suspending or banning users, is only carried out in accordance with terms of service.
Ofcom is the regulator for the OSA. It is responsible for implementing the regime and supervising and enforcing the online safety duties. Ofcom is publishing codes of practice and guidance that provide more detail about the regime and explain how you can comply with your OSA duties.
The OSA sits alongside data protection law. If you are using profiling, either to comply with the OSA or for other trust and safety related purposes, you must comply with data protection law.
Does the OSA say anything specifically about profiling?
There are no duties in the OSA that specifically require the use of profiling as defined in the UK GDPR.
Under the OSA, Ofcom has powers to recommend or require the use of ‘proactive technology’ in certain circumstances. This includes the power to recommend the use of a proactive technology in its codes of practice under the OSA (but not to analyse content, or metadata relating to content, communicated privately). There is also a power for Ofcom to require a service provider to use accredited technology to protect users from terrorism and child sexual exploitation and abuse (CSEA). For CSEA, this power can apply in relation to content and metadata communicated both publicly and privately. (You can find more information in chapter 5 of the OSA.)
Ofcom’s illegal content codes of practice for user-to-user services recommend that certain providers use proactive technology in some circumstances.
The OSA defines ‘proactive technology’ to mean:
- content identification technology;
- user profiling technology; or
- behaviour identification technology.
Each of these proactive technologies has a specific definition in section 231 of the OSA.
The definition of user profiling in the OSA differs from the definition of profiling under the UK GDPR. However, we expect that both user profiling and behaviour identification technologies as defined in the OSA are likely to involve profiling as defined in the UK GDPR. (See the section on What does it cover? for the UK GDPR definition of profiling).
If you are using tools that involve profiling as defined in the UK GDPR, either to comply with the OSA or for other trust and safety related purposes, you must comply with data protection law.
Further reading
- What is automated individual decision-making and profiling?
- Guidance on AI and data protection
- Information Commissioner’s Office response to the consultation series on generative AI
- Children's code including the section on Services covered by this code
- ‘Likely to be accessed’ by children guidance
- Commissioner’s opinion - Age assurance for the Children’s code
- Joint statement with Ofcom on online safety and data protection
Other resources
- Online Safety Act 2023
- Ofcom Children’s access assessments guidance
- Ofcom OSA codes of practice and guidance
- Illegal content codes of practice for user-to-user services
- Draft Protection of children codes of practice for user-to-user services
- Guidance on content communicated 'publicly' and 'privately' under the Online Safety Act