How do we demonstrate our compliance with our data protection obligations?


How do we assess and mitigate the data protection risks involved in our use of profiling tools?

Before deploying profiling tools in your trust and safety systems, you must consider:

  • what personal information you plan to process;
  • whether it’s necessary and proportionate to achieve your aim; and
  • the risks involved and how you’ll mitigate them.

The processing involved in using profiling tools is likely to result in a high risk to people’s rights and freedoms. This is due to the likelihood and severity of potential harm to users.

For example, profiling tools can:

  • collect information at a significant scale (including from users other than those who are the target of the tool’s analysis), leading to unwarranted intrusion and loss of control of personal information;
  • make decisions about people that have significant effects, leading to discrimination, reputational damage or financial harm (eg through loss of income or employment); and
  • use people’s information in ways they don’t expect and which lack transparency, leading to adverse impacts on people’s data protection rights.

It is also important to note that some of the processing activities that profiling tools typically undertake are specifically designated as high risk under data protection law.

For example:

  • profiling of people on a large scale;
  • processing involving new technologies, such as AI;
  • processing that involves tracking a person’s behaviour;
  • making decisions about a person’s access to a product or service based on automated decision-making (including profiling); and
  • using children’s personal information as part of offering a service directly to them.

See our guidance on data protection impact assessments for more information about high-risk data processing.

Given your processing is likely to be high risk, you must carry out a data protection impact assessment (DPIA) prior to processing personal information in your profiling tools.

A DPIA is an effective way to identify and mitigate data protection risks. It also brings broader compliance benefits, because it can help you assess and demonstrate your compliance with the data protection principles and your wider obligations.

Your DPIA must:

  • describe the nature, scope, context and purposes of the processing, being clear about what personal information you want to process and why;
  • assess necessity and proportionality of your planned processing; and
  • identify all relevant risks to people’s rights and freedoms, assess their likelihood and severity, and detail measures to mitigate them.

The following sections consider each of these elements in more detail.

How do we describe the processing?

You must document how and why you plan to use profiling tools. This includes detailing the nature, scope, context and purposes of the processing.

You can find more details about how to describe your processing in our guidance on DPIAs. However, some examples of the information you should include in your description are listed below (one way of recording them is sketched after the list):

  • how you collect and store the personal information;
  • who has access to the information;
  • what technologies you use in your profiling tools;
  • whether you use any third-party providers and, if so, how you make personal information available to them;
  • the types and volumes of personal information used by your profiling;
  • the extent and frequency of the processing that your tools undertake;
  • the security measures that you (and, if relevant, any third-party providers) have in place;
  • the number of users involved in the information processing;
  • what features of your service you deploy your profiling tools on;
  • the source of the personal information your tools process;
  • the reasonable expectations of your users;
  • the intended outcome of your processing;
  • the expected benefits of the processing for you and your service users; and
  • your legitimate interests (where necessary).
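
For illustration, the following sketch shows one way you might record this description in structured form, so it stays consistent across tools and reviews. It assumes a Python codebase; all field names are illustrative rather than required.

```python
# An illustrative record of the "description of processing" for a DPIA.
# Field names are assumptions for this sketch, not prescribed fields.
from dataclasses import dataclass

@dataclass
class ProcessingDescription:
    purpose: str                      # why you are processing the information
    data_categories: list[str]        # types of personal information used
    data_sources: list[str]           # where the information comes from
    service_features: list[str]       # parts of the service the tool runs on
    technologies: list[str]           # eg machine learning classifiers
    third_party_providers: list[str]  # external tool or hosting providers
    approximate_user_count: int       # scale of users involved
    processing_frequency: str         # eg "continuous", "daily batch"
    security_measures: list[str]      # safeguards in place
    expected_benefits: list[str]      # for you and your service users
    legitimate_interests: str = ""    # where relying on legitimate interests

example = ProcessingDescription(
    purpose="Detect accounts showing signs of coordinated spam activity",
    data_categories=["account metadata", "posting frequency"],
    data_sources=["information users provide", "observed activity"],
    service_features=["public forum"],
    technologies=["gradient-boosted classifier"],
    third_party_providers=[],
    approximate_user_count=250_000,
    processing_frequency="daily batch",
    security_measures=["encryption at rest", "role-based access control"],
    expected_benefits=["fewer spam posts", "reduced user harm"],
)
```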

How do we assess necessity and proportionality?

You must demonstrate that the processing is a necessary and proportionate way to achieve your aim.

‘Necessary’ does not mean that processing personal information has to be absolutely essential. However, you must ensure that it is more than just useful or desirable.

This is also the case if you are considering using profiling tools to meet your obligations under the Online Safety Act (OSA). You must still assess whether the personal information processing undertaken by your tool is a necessary and proportionate way of achieving compliance.

To demonstrate necessity, you must:

  • show you have considered alternative, less intrusive measures and demonstrate why these are not sufficient to achieve your aim; and
  • if you have decided that profiling is necessary, ensure you implement it in the least intrusive way. This involves making sure your use of profiling is targeted and proportionate, and considering whether you can achieve your outcome using less personal information, or avoiding more sensitive types of information (eg special category information or other information that could be deemed sensitive).

You must ensure that using personal information in your profiling tools is a proportionate way to meet your aim. This is closely linked to necessity and involves considering whether your planned processing is a reasonable approach.

As part of this, you should consider the following factors:

  • Is your purpose sufficiently important to justify the interference with your users’ privacy? Some aims are more likely to justify intrusion on your users’ privacy than others. For example, using profiling to prevent a serious threat to child users on your platform is more likely to be justified than using it to detect and remove a small number of fake accounts that are not causing significant harm. This assessment requires careful consideration of the planned profiling and the context you deploy it in.
  • What information do you have about the prevalence and impact of the issues you want to tackle? For example, you may hold user complaints or reports about certain types of harms on your service. This helps you understand whether profiling is a proportionate approach to tackle these issues, and what personal information processing is necessary. If you are considering using profiling tools to meet your obligations under the OSA, the risk assessments you are required to undertake under the OSA might support you in thinking about the proportionality of your planned information processing.
  • What risks and impacts will your profiling tools have? This includes considering any moderation actions you take based on, or supported by, profiling, and the nature and severity of those actions.
  • Will the profiling be limited in either time or service area, or will it be continuous and service-wide? Long-term continuous use of profiling across all parts of your service is more intrusive than a time-limited deployment, or a deployment that is ringfenced to certain parts of your service.
  • Would your users reasonably expect you to use their personal information to profile them for trust and safety purposes? Using people’s personal information in your profiling tools is more likely to be proportionate and justified if users expect you to use it, and understand the range of actions you may take on the basis of these tools.
  • Have you struck a fair balance between your interests, the rights of individual service users and the rights of your community of online service users?

How do we identify and mitigate risks?

You must consider the risks to people’s rights and freedoms that arise as a result of your planned profiling, including their likelihood and severity.

This involves considering the potential impact of your profiling on people, and any harm or damage this processing may cause. (See the section on How do we assess and mitigate the data protection risks involved in our use of profiling tools? for examples of the risks and harms that can arise from profiling.)

Assessing whether there is a high risk involves considering both the likelihood and severity of the possible harm.

For each of the risks you identify, you must consider and document what measures you plan to take to eliminate or reduce them.
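
For illustration, one common way to structure this assessment is to score each risk on simple likelihood and severity scales and record the planned mitigations alongside it. The sketch below assumes a Python codebase; the scales, threshold and field names are illustrative, not prescribed.

```python
# Illustrative risk-register entry: score each identified risk on simple
# 1-3 likelihood and severity scales, and record planned mitigations.
from dataclasses import dataclass

@dataclass
class RiskEntry:
    description: str
    likelihood: int   # 1 = remote, 2 = possible, 3 = probable
    severity: int     # 1 = minimal, 2 = significant, 3 = severe
    mitigations: list[str]

    @property
    def score(self) -> int:
        return self.likelihood * self.severity

    @property
    def high_risk(self) -> bool:
        # Treat any risk scoring 6 or more as high (illustrative cut-off).
        return self.score >= 6

risk = RiskEntry(
    description="Users incorrectly flagged by the profiling tool",
    likelihood=2,
    severity=3,
    mitigations=["human review of flagged cases", "regular accuracy checks"],
)
print(risk.score, risk.high_risk)  # 6 True -> mitigate before deployment
```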

The risks associated with using profiling tools vary depending on the personal information you use and how you configure your systems. However, some examples of options to consider for reducing risk include the following (the pseudonymisation option is sketched in code after the list):

  • deciding not to collect certain types of information;
  • putting in place measures to limit how much information you use;
  • providing clear transparency information to users;
  • assessing the efficacy of your systems at regular intervals;
  • anonymising or pseudonymising information, where possible; and
  • using human reviewers to check the outputs of your profiling tools, where appropriate.
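
As an illustration of the pseudonymisation option, the sketch below replaces raw user identifiers with keyed hashes before information enters a profiling pipeline. This is pseudonymisation rather than anonymisation, because anyone holding the key can re-identify users; the key, function and variable names are illustrative.

```python
# A minimal sketch of pseudonymising user identifiers before they reach a
# profiling pipeline. The key must be stored separately from the pipeline;
# whoever holds the key can re-identify users, so this is pseudonymisation,
# not anonymisation.
import hashlib
import hmac

PSEUDONYMISATION_KEY = b"store-this-secret-outside-the-pipeline"

def pseudonymise(user_id: str) -> str:
    """Return a stable pseudonym for a user ID using keyed hashing."""
    digest = hmac.new(PSEUDONYMISATION_KEY, user_id.encode("utf-8"),
                      hashlib.sha256)
    return digest.hexdigest()

# The profiling tool sees only the pseudonym, never the raw identifier.
record = {"user": pseudonymise("user-12345"), "messages_per_hour": 42}
```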

If your DPIA identifies a high risk that you cannot reduce to an acceptable level, you must consult us before going ahead with the planned processing.

Example

A video gaming platform is considering using a profiling tool to detect possible grooming activity in the forum feature of its service. The tool uses machine learning to predict whether grooming behaviour is taking place.

The service currently relies on user reports to detect this behaviour.

The service undertakes an assessment to determine whether it is necessary and proportionate to use this profiling tool. As part of this, the service considers whether user reporting alone is sufficient to detect and take action against this behaviour. It reviews user reports of grooming behaviours on its site, as well as wider evidence it holds about the risk of grooming behaviours on its service.

Having considered and documented the need for the tool, the service concludes that, in this case, there are no less intrusive ways to identify and remove grooming behaviours. It decides to implement the profiling tool.

The service considers the risks to users that result from its planned profiling, and ways to mitigate them. Some of the risks identified by the service include:

  • the tool processing more information than necessary;
  • users being incorrectly flagged as exhibiting grooming behaviour (false positives); and
  • the processing not being clear and transparent to users.

The service ensures its deployment of the profiling tool is targeted by carefully considering what personal information the tool needs to operate effectively. It also limits use of the tool to the forum feature of its service, where evidence suggests it would be most beneficial. The service decides to undertake a review of the tool’s effectiveness every three months.

The service considers the accuracy of the tool and the impact its use has on users. It implements a human review process to check suspected cases of grooming flagged by the profiling tool, along the lines of the sketch below.
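
For illustration, the sketch below shows how such a review flow might look, assuming the tool emits a confidence score between 0 and 1. The threshold, names and queue design are assumptions for the sketch, not a recommended configuration.

```python
# Illustrative review flow: high-confidence flags go to human reviewers
# rather than triggering automatic action, and reviewer outcomes feed the
# quarterly effectiveness review.
REVIEW_THRESHOLD = 0.7  # illustrative: below this, no action is taken

review_queue: list[dict] = []

def route_prediction(case_id: str, score: float) -> None:
    """Send high-confidence flags to human reviewers; never act automatically."""
    if score >= REVIEW_THRESHOLD:
        review_queue.append({"case_id": case_id, "score": score})

def precision_from_reviews(outcomes: list[bool]) -> float:
    """Share of human-reviewed flags confirmed as grooming behaviour.
    Tracking this shows whether the false-positive risk identified in the
    DPIA is under control."""
    return sum(outcomes) / len(outcomes) if outcomes else 0.0

route_prediction("case-001", 0.91)
print(precision_from_reviews([True, True, False]))  # ~0.67: two of three confirmed
```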

The service tells its users about how it uses profiling in its trust and safety systems. It updates its privacy notice with information about the personal information processing involved.

How do we integrate data protection by design and by default?

You must follow a data protection by design and by default approach when you decide to implement a profiling tool in your trust and safety systems.

This means you must:

  • put in place appropriate technical and organisational measures designed to implement the data protection principles effectively; and
  • integrate necessary safeguards into the processing, so you meet the UK GDPR’s requirements and protect people’s rights.

In practice, this is about considering privacy and data protection issues at the design stage of any system and throughout its operation.

For example, you must:

  • make data protection an essential component of the core functionality of your profiling tools;
  • only process the personal information you need for the purposes you have specified, and do not use it for any other purposes (one way to enforce this is sketched after the list); and
  • provide users with information so that they can easily understand how you are using their personal information in your profiling tools.
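
As a small illustration of the minimisation point above, the sketch below enforces an allow-list agreed in the DPIA, so the profiling tool never receives fields it does not need. The field names are illustrative.

```python
# A minimal sketch of enforcing data minimisation at the design stage: the
# profiling tool only ever receives fields on an approved allow-list.
ALLOWED_FIELDS = {"pseudonymous_id", "message_rate", "account_age_days"}

def minimise(record: dict) -> dict:
    """Strip anything not approved for the profiling purpose."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

raw = {
    "pseudonymous_id": "ab12",
    "message_rate": 42,
    "account_age_days": 7,
    "email": "user@example.com",   # not needed for profiling, so dropped
}
profiling_input = minimise(raw)    # the email never enters the tool
```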

Please consult our guidance on data protection by design and default and privacy in the product design lifecycle for further information.

What if our profiling tools use children’s information?

Children’s personal information merits particular protection under the UK GDPR. This is because children may be less aware of the risks involved when you collect and process their personal information.

You must carry out a DPIA if you are profiling children as part of offering an online service directly to them.

If you are processing children’s personal information, you should conform with our Children’s code. The code exists to help information society services likely to be accessed by children understand how to comply with their obligations to protect children’s personal information. It sets out 15 standards that you should implement.

The Children’s code recommends that you switch profiling off by default, unless you can demonstrate a compelling reason for profiling to be on by default. (A sketch of this default-off approach appears at the end of this section.)

Examples of compelling reasons include profiling:

  • to meet a legal or regulatory requirement (such as safeguarding);
  • to prevent child sexual exploitation or abuse online; or
  • for age assurance.

Compliance with your OSA obligations may be a compelling reason to enable profiling by default. However, it is important to remember that not every form of profiling constitutes a compelling reason. You must demonstrate that using profiling is a necessary and proportionate way to comply, taking into account the risks to people’s rights and freedoms.

We discuss necessity and proportionality in more detail in the section on How do we assess and mitigate the data protection risks involved in our use of profiling tools?.

You must also consider how to comply with the principles of data protection law more broadly. As part of this, you must ensure that your tools only process personal information that is necessary to meet your legal or regulatory requirements.
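
Pulling these points together, the sketch below shows, for illustration, a default-off setting for child accounts that can only be switched on with a recorded compelling reason. The structure and names are illustrative assumptions, not a prescribed design.

```python
# Illustrative settings default reflecting the Children's code: profiling is
# off by default and only enabled with a documented compelling reason.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ChildAccountSettings:
    profiling_enabled: bool = False           # off by default
    compelling_reason: Optional[str] = None   # documented justification

    def enable_profiling(self, reason: str) -> None:
        """Only enable with a recorded compelling reason (eg safeguarding)."""
        if not reason:
            raise ValueError("A documented compelling reason is required")
        self.compelling_reason = reason
        self.profiling_enabled = True

settings = ChildAccountSettings()
settings.enable_profiling("Detecting grooming risk to meet safeguarding duties")
```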

Who is the data controller for our profiling tools?

Profiling tools can be used in different circumstances and involve different organisations. For example, you could either develop your own tool or use one that another organisation provides.

If you develop your own profiling tool, you are the data controller for any processing you carry out when using the tool. This is because you have decided the purposes and means of that processing – the ‘how’ and the ‘why’.

If you use a tool developed by someone else, you must be clear about whether this other organisation only processes personal information on your behalf, or whether in practice both of you make decisions about the processing jointly.

If the tool provider is just carrying out processing for you under your written instructions, they are your data processor. You must only choose a processor that offers sufficient guarantees of data protection compliance.

If you and the tool provider jointly determine the purposes and means of the processing you carry out using the tool, you are joint controllers. You must look at the factual reality of the processing activities and assess whether this joint determination is happening in practice. If it is, you and the tool provider must each determine your respective compliance responsibilities.

With profiling tools, decisions about the ‘how’ and the ‘why’ of the processing can involve:

  • whether to deploy the profiling tool in the first place;
  • what you intend the profiling tool to achieve;
  • the categories and sources of the personal information the tool processes;
  • the moderation actions, if any, you take using your profiling tools; and
  • how long you keep the personal information for.

These are decisions that only controllers can take.

In comparison, and subject to the specifics of the arrangement, a processor can take certain day-to-day operational decisions. These include decisions about:

  • which IT systems the profiling tools use;
  • how these systems store personal information;
  • the security measures that apply to the personal information; and
  • how to retrieve, transfer, delete and dispose of the personal information.

These issues are important if you decide to use a profiling tool that someone else develops, particularly if it involves AI. For example, a tool that you don’t develop yourself may be designed in a way that affects the level of control and influence you actually have over the processing it does. This can mean that you and the provider are joint controllers for the processing activities.

So, in practice, each organisation involved in deploying profiling tools for trust and safety systems must carefully consider the nature of the processing activities and the level of control and influence it exercises over them.

How do we share personal information relating to our profiling?

If you decide to share the personal information used in, or created by, your profiling tools, you must consider whether the sharing is necessary to achieve your intended purpose.

For example, if you work with another organisation to monitor trends in user behaviour on your service, you must assess whether you can achieve your purpose by disclosing anonymised data rather than personal information about your users. (You can find more information in our anonymisation guidance.)
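
As an illustration of that aggregation approach, the sketch below shares only category counts and suppresses small groups, because low counts can still identify individuals. The threshold is illustrative, and aggregation alone does not guarantee that the output is anonymous; you still need to assess re-identification risk.

```python
# A simple illustration of sharing aggregate trend data instead of
# user-level records, dropping counts below a minimum group size.
from collections import Counter

SUPPRESSION_THRESHOLD = 10  # illustrative minimum group size

def aggregate_for_sharing(events: list[str]) -> dict[str, int]:
    """Count behaviour categories and drop groups that are too small."""
    counts = Counter(events)
    return {category: n for category, n in counts.items()
            if n >= SUPPRESSION_THRESHOLD}

events = ["spam_report"] * 120 + ["harassment_report"] * 3
print(aggregate_for_sharing(events))  # {'spam_report': 120}; small group dropped
```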

In cases where sharing personal information is necessary, you must identify a lawful basis. If you are disclosing special category information or criminal offence information, you must identify additional conditions for processing. (See the section on How do we use profiling tools lawfully? for more information).

In addition, you should:

  • carry out a DPIA to help you assess the risks of your planned data sharing and determine whether you need to introduce any safeguards; and
  • put in place a data sharing agreement that sets out why you are sharing the information, explains what happens to it at each stage, and sets standards for the sharing.

Our data sharing code of practice provides guidance about how to share personal information with other organisations. It includes information about sharing information in an emergency and sharing information with law enforcement agencies.

What do we need to consider if we transfer people’s personal information outside the UK?

Data protection law contains rules about transferring personal information to receivers located outside the UK. We refer to these as ‘restricted transfers’. (Read our guide on international transfers for more information.)

How do we demonstrate accountability?

The accountability principle makes it clear that you are responsible for complying with data protection law, and that you must be able to demonstrate your compliance.

You must:

  • implement technical and organisational measures to ensure and demonstrate compliance with the UK GDPR;
  • ensure these measures are proportionate to the risks involved with your processing (which can vary depending on the amount of information you are processing, its sensitivity and the technology you use); and
  • review and update the measures as necessary.

You should:

  • check your existing practices against our expectations and guidance;
  • consider whether you could improve data protection practices;
  • understand ways to demonstrate compliance; and
  • ensure staff at all levels have a good understanding and awareness of data protection.