How do we ensure data minimisation in our content moderation?

The data minimisation principle means you must use personal information in content moderation in a way that is adequate, relevant and limited to what is necessary.

You must take particular care when processing a child’s personal information.

Content moderation technologies and methods are capable of gathering and using more information than may be necessary to achieve your purposes. This risks unnecessary intrusion into your users’ privacy. 

In many cases you can make accurate content moderation decisions based solely on the content. If so, you must avoid using other personal information associated with the content or user’s account.

Moderation of content can be highly contextual. Sometimes, you may need to use other types of personal information (beyond the content itself) to decide whether to take moderation action (a sketch follows this list), including users': 

  • previous posts on the service;
  • records of previous content policy violations;
  • interactions on the service (such as likes and shares); and
  • interests.
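
To make this concrete, here is a minimal sketch of how a moderation record could keep contextual information optional by default, so context is only attached when a decision genuinely requires it. It assumes a Python-based pipeline; the class and field names are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ModerationCase:
    """Hypothetical moderation record: the content is always present,
    while contextual fields stay unset unless a decision requires them."""
    content: str                                    # the flagged content itself
    previous_posts: Optional[list[str]] = None      # previous posts on the service
    violation_records: Optional[list[str]] = None   # past content policy violations
    interactions: Optional[list[str]] = None        # likes, shares and similar
    interests: Optional[list[str]] = None           # inferred interests

# By default a case carries only the content; context is added later,
# and only if the content alone does not support an accurate decision.
case = ModerationCase(content="flagged post text")
```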

You comply with the data minimisation principle, as long as you can demonstrate that:

  • using this information is necessary to achieve your purpose (eg because it ensures your decisions are accurate and fair); and
  • no less intrusive option is available to achieve this.

You must be clear about:

  • what personal information you anticipate is necessary to make decisions about content on your service; and
  • the circumstances when you might need to use this information.

You should:

  • document this and be able to justify it;
  • keep your record of this under review, in case you identify further types of personal information as necessary in future;
  • consider using pseudonymisation to achieve data minimisation, where appropriate (see the sketch after this list). Pseudonymisation helps reduce risks to people and improves security;
  • provide clear guidance and training for your moderators (including any volunteer or community moderators you may use). This will help them understand what personal information to use in their decision-making and the requirements of data protection law; and 
  • ensure that moderators understand when to escalate decisions about content.
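
As one illustrative approach to pseudonymisation (not mandated by this guidance), a keyed hash such as an HMAC can replace direct identifiers with stable pseudonyms before records reach moderators. The key handling and record shape below are assumptions for the sketch.

```python
import hmac
import hashlib

# Assumed secret, stored separately from the moderation system so that
# moderators cannot reverse the pseudonyms on their own.
PSEUDONYM_KEY = b"replace-with-a-securely-stored-secret"

def pseudonymise(user_id: str) -> str:
    """Derive a stable pseudonym from a user ID using a keyed hash,
    so the same user is recognisable across records without exposing
    their real identifier."""
    digest = hmac.new(PSEUDONYM_KEY, user_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

record = {"user": pseudonymise("user-12345"), "content": "flagged post text"}
```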

You could implement access controls to ensure that human moderators are only able to view and access personal information that is relevant to inform moderation decision-making. For example, you could use an interface that displays only the content to a human moderator in the first instance, with the option to request access to additional user details if required. 
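
A minimal sketch of such a tiered interface follows, assuming a Python service; the function and field names are illustrative. The content is shown first, and wider details are released (and logged) only after an approved request.

```python
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("moderation.access")

def case_view(case: dict, extra_access_granted: bool = False) -> dict:
    """Show only the content by default; release additional user details
    solely when access has been granted, and log every such release."""
    view = {"content": case["content"]}
    if extra_access_granted:
        log.info("additional details released for case %s at %s",
                 case["case_id"], datetime.now(timezone.utc).isoformat())
        view["previous_posts"] = case.get("previous_posts", [])
        view["moderation_history"] = case.get("moderation_history", [])
    return view

# First-instance view: content only.
initial = case_view({"case_id": "c-789", "content": "flagged post text"})
```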

If you are using third-party moderation providers, you must limit the information you give them to what is relevant and necessary for them to carry out moderation.
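
One way to enforce this in practice is an allowlist applied to outgoing data, so that only the fields the provider needs ever leave your service. This sketch assumes a Python integration; the field names are hypothetical.

```python
# Hypothetical allowlist of the fields a third-party moderation
# provider actually needs; everything else is stripped before sending.
PROVIDER_FIELDS = {"content", "content_id", "language"}

def minimise_for_provider(case: dict) -> dict:
    """Keep only allowlisted fields, dropping account details, history
    and any other personal information the provider does not need."""
    return {key: value for key, value in case.items() if key in PROVIDER_FIELDS}

payload = minimise_for_provider({
    "content": "flagged post text",
    "content_id": "c-789",
    "language": "en",
    "email": "user@example.com",          # dropped: not needed
    "moderation_history": ["warning"],    # dropped: not needed
})
```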

Example

A service deploys a content moderation process that aims to detect content that violates its content policies.

A piece of content is flagged as potentially violating the service's content policies and the system sends it to a human moderator for review.

Initially, the moderator reviews the content on an interface that displays only the content.

In this case, the content alone is not sufficient to make an accurate judgement about it. Therefore, the moderator needs to analyse additional personal information to support their decision. 

The moderator consults guidelines provided by the service that explain what additional personal information may be necessary for their decision-making.

Following the guidelines, the moderator applies for access to view the user’s previous posts in the thread and their moderation history on the site. The moderator’s access to this information is logged by the service.

This represents good practice under the data minimisation principle, because the information used by the moderator is kept to the minimum needed to make an accurate decision. 

Data minimisation and illegal content judgements under the OSA

Data minimisation still applies when services use personal information to make illegal content judgements under section 192 of the OSA. Under data protection law, this means you must use personal information that is proportionate and limited to what is necessary to make illegal content judgements.

To comply with data protection law, you should:

  • be clear, in advance, about what personal information you may need to make illegal content judgements;
  • document this and keep it under review; and
  • provide clear guidance for your content moderators on using personal information to make illegal content judgements.
