The ICO exists to empower you through information.

Your assessment must be evidence-based, and you must strive to be as rigorous and objective as possible. But data protection law and the code are not prescriptive about the tools you must use to assess impacts on children: you are free to use the approaches and evidence sources best suited to your context. Evidence resources you could use include:

  1. Consultation with children and parents. The code’s DPIA and best interests standards encourage you to consult children and parents on their needs and their views on how you intend to use their data. Approaches for doing this include user surveys, primary research and focus groups, co-design workshops and engagement with youth panels. Our guidance for designers provides artefacts and tools to help you engage with children and parents in the context of the code.
  2. Academic and grey literature. Journal articles, academic publications, research by child advocates and civil society organisations, and other open research sources provide a general evidence base on the risks and benefits of online services to children. The United Nations Digital Library and UNCRC general comment database also offer general theoretical background on the individual rights children hold under the UNCRC.
  3. Scenario-based tools. Scenario-based tools offer a structured process for considering how and when different impacts may arise. They do this by developing a range of hypothetical present and future scenarios to consider (e.g. best-case, most likely and worst-case). They can also support you to identify events and underlying drivers that could trigger these scenarios. For example, through threat modelling or “back-casting” to identify root causes of risks. Some scenario-based tools and approaches are available on an open-source basis.
  4. User redress and feedback data. Data from mechanisms that allow children and parents to give feedback on your service can be a rich source of evidence. For example, complaints, requests to exercise data rights, or requests for help. Some online services publish transparency reports on levels of harm and complaints, which also provide a comparative benchmark for you to refer to.

    Note that such evidence is less suitable for assessing impacts that are less visible to children and parents, such as those arising from the use of algorithms.
  5. Engagement with children’s development and rights specialists. Consulting third parties with proven expertise in children’s developmental needs, their rights, and the online risks they face can give you assurance on your assessment.
  6. External audit and review. Consider commissioning a relevant third party to assess the potential impact of your services on children’s rights.
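The scenario-based approach described in point 3 can be made concrete with a simple structured record. Below is a minimal sketch of developing best-case, most likely and worst-case scenarios and then "back-casting" from the worst case to its root causes. The feature, scenario descriptions and triggers are entirely hypothetical illustrations, not examples from the code or ICO guidance:

```python
from dataclasses import dataclass, field

@dataclass
class Scenario:
    """One hypothetical scenario for a data-processing feature."""
    name: str          # e.g. "best-case", "most likely", "worst-case"
    description: str   # what happens to child users in this scenario
    triggers: list = field(default_factory=list)  # events or drivers that could bring it about

# Illustrative entries for a hypothetical location-sharing feature
scenarios = [
    Scenario("best-case",
             "Location sharing stays off by default and is rarely enabled.",
             ["defaults respected"]),
    Scenario("most likely",
             "Some children enable sharing without understanding who can see it.",
             ["unclear consent prompt"]),
    Scenario("worst-case",
             "A child's real-time location is visible to unknown adults.",
             ["default-on setting", "public profiles"]),
]

# Back-casting: start from the worst-case outcome and list the root causes to address
worst = next(s for s in scenarios if s.name == "worst-case")
for trigger in worst.triggers:
    print(f"Root cause to mitigate: {trigger}")
```

Working backwards from the worst case in this way surfaces the underlying drivers (here, a default-on setting and public profiles) that mitigation measures would need to target.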

Regardless of the approach you take to develop your evidence base, you should always keep the following considerations in mind when assessing likelihood and severity:

  • Age and developmental stages. The age ranges of your child users will influence your assessment. In general, children’s capacity to understand and respond to impacts on their rights increases as they get older and their capacities develop. In some cases, the severity of risks (and the scale of benefits) will be influenced by the developmental stages of your users. For example, targeted advertising of age-inappropriate goods, such as alcohol, can impede physical development. The code’s Annex on age and developmental stages gives guidance on considerations across children’s developmental stages. Our “creating age-appropriate mindsets” workshop supports you in thinking about different user needs across a range of scenarios.
  • Detrimental uses of data. The code’s detrimental use of data standard states that you must not use data in ways that are demonstrably against the wellbeing of children, as defined by relevant external bodies, for example the Chief Medical Officer and Public Health England. Risks that carry a tangible chance of breaching such standards are intolerable. You should consider any relevant standards for each of the risks to rights you identify. Our Detrimental use of data articles can assist you with this.
  • High risk data processing. Certain code standards were included in recognition that some forms of data processing pose particularly high risks to children, for example data sharing, profiling, geolocation and connected toys and devices. Our Examples of processing likely to result in high risk details other activities where we believe risks to individuals are particularly high. Whilst this list is not specific to children, it still applies to them. If you are undertaking any of these processing activities, you need to introduce measures that significantly lower these risks, and develop an evidence base to demonstrate their effectiveness.