
Children’s code strategy progress update – March 2025


Latest updates - updated 3 March 2025

3 March 2025 - this progress update was published.

Introduction

This update sets out our progress on protecting children’s privacy online. It shows the impact our work has had so far, including how platforms are delivering for children and which ones have more work to do.

Our approach was set out in our Children’s code strategy, which explained that our focus in 2024/25 is on social media platforms (SMPs) and video sharing platforms (VSPs). In our August 2024 update we outlined the findings from the first phase of our work. This included a review of 34 SMPs and VSPs 1 and our initial action.

This update includes a comparison table setting out key metrics that informed our review to provide further transparency on our findings to date. We are also publishing a summary of the responses received to our call for evidence. This is informing our work on recommender systems and the use of personal information of children under 13 years old.

Summary of our impact

Our strategy focuses on the ways SMPs and VSPs protect children’s information online, looking at their use of:

  • default privacy settings;
  • default geolocation settings;
  • targeted advertising;
  • recommender systems; and
  • the information of children under 13. 

Following the initial interventions set out in our August update, we have seen the following improvements:

  • Five platforms have improved, or committed to improve, the default privacy settings that protect children's personal information. This includes making profiles private when children sign up to a service or limiting the extent to which children can change high privacy default settings.
  • Four platforms have improved the default geolocation settings on their service. This includes no longer automatically including geolocation information in children's profiles, and stopping children from posting their location. These changes help keep children safer in the physical world.
  • Two platforms have turned off, or have committed to turn off, personalised adverts for all under 18s. This ensures that children’s default advertising experience is not based on their behavioural data or profiles.
  • One platform has committed to introduce age assurance methods, to help ensure that children have an age-appropriate online experience. 

In addition, we have three live investigations open. These are assessing how:

  • TikTok processes the personal information of 13 to 17 year olds in their recommender systems; and
  • Imgur and Reddit process the personal information of children in the UK and their use of age assurance measures.

You can see further information on the actions taken so far in Table 1. We have noted where platforms have already made improvements and where they have committed to, but not yet made, changes. We will monitor platforms yet to introduce improvements, to ensure that they implement their planned changes as expected. 

We will also continue to push for further improvements in practices which do not comply with the law or conform to the Children’s code. The comparison table provides further information on ongoing activities and covers all platforms in our sample. The ongoing work includes supervisory activity, where we are scrutinising the approach platforms take in specific areas through compliance discussions and written enquiries. We stand ready to open further investigations, progressing to formal enforcement action, where appropriate. 

Our work to drive further improvements in children’s privacy will continue in 2025/26. We will publish a further update in due course. 

Table 1 - Summary of action taken following our initial interventions

BeReal

Improvements made.

  • BeReal has stopped processing children’s precise geolocation data (at street or landmark level). This means that under 18s now only have the option of including approximate city-level location when posting.
Dailymotion

Improvements made.

  • Dailymotion implemented new privacy and transparency measures, including providing additional notifications to children before they upload videos to remind them not to disclose personal information.
  • Dailymotion added warnings to remind children to be cautious when drafting video descriptions so that they do not share personal information. Dailymotion has also improved their guidance for children.
Discord

Compliance discussion ongoing.

  • We continue to have compliance discussions with Discord about their approach to delivering high privacy by default. These discussions take account of the existing measures Discord has in place, which they consider operate in the best interests of the child.
Frog

Information notice 2 sent, response being reviewed.

  • We wrote to Frog about their default privacy and geolocation settings. When they did not respond, we sent them an information notice. We are assessing the evidence they subsequently provided and our compliance assessment is ongoing. 
Fruitlab

Exited UK market.

  • We wrote to Fruitlab about the processing of personal information of children under 13 years old. When they did not respond, we sent them an information notice. Fruitlab has since exited the UK market. 
Hoop

Commitment to make improvements.

  • Hoop has committed to ensure that, by default, children's profiles are private by the end of Q1 2025.
Imgur

Investigation open.

  • We wrote to Imgur about the processing of personal information of children under 13 years old. When they did not respond, we sent them an information notice. We have since launched an investigation into how Imgur processes children’s personal information and their use of age assurance.
Reddit

Investigation open.

  • We have an ongoing investigation looking at how Reddit processes children’s personal information in the UK and their use of age assurance measures.
Sendit

Improvements made.

  • Sendit has stopped automatically populating user profiles with location information.

  • Sendit has also introduced new in-app location settings to make it easier for users to enable or disable location services, giving them better control of their information.

Soda

Improvements made.

  • Soda has removed country-level location information that was previously automatically included in children’s profiles. This means that children have greater control over their personal information, including their location data.
TikTok

Investigation open.

  • We have opened an investigation into how TikTok processes the personal information of 13 to 17 year olds in the UK in their recommender systems.
Twitch

Commitment to make improvements.

  • Twitch has committed to change the default settings for teen users in the UK in the first half of 2025, so that, by default, no clips of teen streamers can be made or shared.
Vero

Commitment to make improvements.

  • Vero has committed to introduce age assurance measures by the end of June 2025.
  • Vero has also committed to introduce new protections for children aged 13 to 17 by the end of June 2025. This includes a ‘safe mode’ which locks certain profile settings to the most privacy-friendly options by default, with limits on what users can change.
  • We have also received information from Vero about their geolocation settings. We are continuing our compliance discussions with them about their approach to private profiles and geolocation standards.
Viber

Commitment to make improvements.

  • Viber has committed to turn off personalised advertising for 17-year-olds. Previously this was only off by default for children up to 16 years old.
  • They have also committed to extend privacy protections to all users under 18 by the end of Q1 2025, to ensure that only known contacts can add them to groups (this protection is currently in place for children aged 16 and under).
Vimeo

Improvements made.

  • Vimeo informed us of planned changes to their UK platform to ensure that all new accounts set up after November 2024 are private by default, irrespective of age. These changes include only enabling users to view videos on the platform if they are given access via a direct link to the video. This means that users’ personal information, and its visibility, remains within their control.
  • We are following up with Vimeo on their approach to age assurance and applying the standards of our code.
X

Improvements made.

  • X has stopped serving ads to users under 18 years old.
  • X has also removed the ability for under 18s to opt in to geolocation sharing (though this change does not apply to approximately 1,209 existing users who previously opted in to this feature).
  • X has improved the privacy and transparency materials available for under 18s, creating a dedicated help centre for child users and parents.

Default privacy settings

Our code states that settings must be ‘high privacy’ by default for children, 3  unless organisations can demonstrate a compelling reason for a different default setting. 

The code is not prescriptive about how organisations should deliver this. However, the code does set out that other users should only see a child’s information if the child amends their settings to allow this, unless there is a compelling reason to do otherwise. 

Organisations can achieve this by making children’s profiles private by default. Alternative approaches might include:  

  • enabling children to have public profiles that do not contain personal information; 
  • providing settings to control the visibility and searchability of children’s content and associated profile information;
  • providing settings or safeguards to prevent children receiving messages from strangers; or
  • limiting the visibility of children’s personal information in profiles to ‘friends only’.

In August 2024, we wrote to five organisations about the approach they took to default privacy settings. As set out in Table 1, two of these platforms, Dailymotion and Twitch, have introduced or are introducing changes to improve children's privacy, while we continue to assess the approach taken by Frog and Discord. Sendit has set out the measures that they have in place to protect users’ privacy; on the basis that these deliver high privacy by default, we do not propose to take any further action, but we would revisit this position if relevant changes were made.

In addition, we have secured commitments from Hoop to make children’s profiles private by default. Vero and Vimeo have made changes, or committed to make changes, to improve their approach to default privacy settings (see Table 1).

Default geolocation settings

Our Children’s Data Lives research suggests that sharing geolocation is becoming more normalised. Children feel it is an expression of trust to share their location with close friends and family. 

However, sharing geolocation can also bring risks. Our code makes it clear that organisations should switch off geolocation settings by default unless there is a compelling reason to do otherwise, taking into account the best interests of the child. In addition, providers should not use nudge techniques to encourage children to turn off privacy protections.

Our review found that platforms typically have precise geolocation sharing off by default and do not share children’s locations automatically with other users. However, some appear to nudge children to switch settings on or encourage them to share their location with others.  

In August 2024, we wrote to five organisations about their geolocation settings to set out our expectations and clarify their practices. As a result of our intervention, BeReal, Sendit and Soda have introduced changes to improve their approach (see Table 1). We continue to have compliance discussions with the other two platforms, Frog and Vero.

In addition, we have secured an improvement in X’s approach, which has removed the ability for under 18s to opt-in to geolocation sharing.

Profiling children for targeted advertisements

Organisations often use profiling to facilitate targeted advertising practices. Our code states that organisations should switch off profiling by default unless they can demonstrate that there is a compelling reason for a different setting, taking into account the best interests of the child.

Our review found that some platforms have decided not to show any advertising to children, while others use only very limited data points, such as age and high-level location data. This helps ensure that advertising is age and jurisdiction specific. However, on some platforms it was not always clear what personal information they were collecting from children and how they were using it for targeted advertising.

We met with a number of platforms to set out our expectations and to clarify their approach. Most have confirmed that they make very limited use of children’s data for advertising purposes (see the comparison table). However, following our intervention:

  • X has stopped serving advertisements to children.
  • Viber has committed to extend their 'off-by-default' setting for targeted advertising so that 17-year-old users also benefit from this protection (which currently only applies to children under 17 years old).

Use of children’s information in recommender systems

Recommender systems are algorithmic processes that use personal information and profiling to learn user preferences and interests in order to suggest or deliver content. 

Our code sets out that organisations should switch off profiling by default. This applies unless they can show that there is a compelling reason for a different setting, taking into account the best interests of the child. Organisations can only use profiling if there are measures in place to protect the child from any harmful effects, in particular being fed content that is detrimental to their health and wellbeing.

Our review found that recommender systems use a wide range of children’s personal information. Platforms provide limited details about how they use the information to make recommendations, or what measures they take to protect children’s privacy when doing so. 

Since publishing our August 2024 update, we have further developed our understanding of the data processing associated with these systems by:

  • writing to a number of SMPs and VSPs to learn more about how they use children’s personal information in their recommender systems; and 
  • launching a call for evidence on this issue. Our summary of the responses received sets out the concerns that some stakeholders raised about the way recommender systems operate to maximise user engagement, and the lack of transparency about the amount of personal information these systems collect and how they use it. The responses also outlined how recommending content to children can lead to extended online use and the risk of children being exposed to harm.

We have concerns about the volume and range of children’s personal information that these systems use, and whether they have sufficient protections in place for children. 

SMPs and VSPs that use children’s personal information in recommender systems must consider carefully whether their approach is lawful, fair and transparent. They also need to ensure they keep the personal information they collect to the minimum amount necessary. There are growing concerns about recommender systems using information generated by children’s online activity to recommend inappropriate and harmful content, at times in significant volume, and in ways that prolong their engagement and damage their wellbeing.

We have recently launched an investigation to examine how TikTok, one of the most popular platforms used by young people, processes the personal information of 13 to 17 year olds in the UK in their recommender systems.

We will continue to work with Ofcom as the regulator responsible for online safety, recognising the interactions between our remits in the regulation of recommender systems.

Use of information of children under 13 years old

Data protection law and the code seek to ensure that all children have an age-appropriate experience online. Younger children are likely to need greater protection than older children due to their earlier stage of development. The UK GDPR reflects this, as services must get parental consent if they rely on consent as the lawful basis for processing the personal information of children under the age of 13. 4 

Our review found that most platforms in our sample have terms of service that specify a minimum age of 13. In addition, many appear to rely on consent for at least some of their data processing activities. These platforms need to have an effective means to ensure they are not unlawfully processing the information of under 13s, for instance age assurance or parental consent. 

Our initial priority was to act where platforms did not appear to use any age assurance at the account set-up stage. 5 We wrote to four platforms to clarify their approach. Following on from this:

  • Vero has committed to introducing age assurance measures by the end of June 2025.
  • We have opened an investigation into how Imgur assesses the age of their users.
  • Fruitlab has exited the UK market.
  • Our compliance discussions with Vimeo are ongoing about their approach to age assurance and applying the standards of our code. 

In addition, we have an ongoing investigation into Reddit about the processing of personal information of children in the UK and their use of age assurance measures. 

Our review found that a large number of SMPs and VSPs appear to rely on users’ self-declaration of their age (see the comparison table). 

We understand that some of these platforms use profiling, as well as self-declaration, to identify users under the age of 13. There is currently limited evidence available on the effectiveness of this method. 6 We are writing to platforms to better understand their approach and will consider next steps in light of the information we receive. 7

Where platforms rely on self-declaration alone, this approach is unlikely to be effective if there are significant risks to children from data processing on the service. These platforms should adopt an age assurance method with an appropriate level of accuracy and one that is fair to users. 8 Driving improvements here will be a priority for us in the coming months, with a focus on services where the risks to children are likely to be higher.

Our work with Ofcom continues, and will ensure that our respective policies are consistent with each other’s regulatory requirements and guidance. We also continue to work with international regulators: for example, in September 2024, we published a joint statement on a Common International Approach to Age Assurance, which has been signed by 11 data protection authorities around the world.

1 We created accounts using proxies for children of different ages to replicate the sign-up process and observed key account settings and privacy information.

2 An information notice is a formal request for an organisation or individual to provide us with information, within a specified time frame, to assist us with our investigations. In some circumstances it may be a criminal offence to provide a response which is false in any material respect.

3 There are risks when the personal information within profiles is made public. This could for example facilitate bullying or result in children receiving unwanted attention or contact from strangers.

4 We recognise that organisations also have obligations to protect children under the Online Safety Act (OSA). Ofcom is the regulator for the OSA, which includes provisions to ensure that children are not normally able to encounter pornographic content and to protect children from harmful content online, including on VSPs and SMPs. 

5 It was possible that some subsequently used a form of profiling for age assurance purposes, but this was not clear from our review. The code also states that organisations can either apply all standards of the code to all users or implement age assurance appropriate to the data protection risks on their platforms.

6 We sought further information on this in our call for evidence, however we did not receive clear evidence about its potential effectiveness.

7 As part of this, we will be looking at the use of profiling for age assurance where users do not need to sign in to an account, including the processing of the information of children under 13, where verified parental consent may be required.

8 We have published case studies that organisations have shared with us, which illustrate different approaches taken to providing age assurance or age-appropriate experiences for children.