Case study: North Ayrshire Council schools - use of facial recognition technology

An education authority is considering whether to use Facial Recognition Technology (FRT) to verify pupils’ identities at the cash register during lunch, in order to deduct money from an online account and facilitate cashless catering. FRT involves processing pupils’ special category biometric data in the form of a facial template, which is a mathematical representation of a pupil’s facial features that would be matched with an image taken at the cash register.
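By way of illustration, the sketch below shows in simplified Python how a stored facial template might be compared with a template derived from the image taken at the cash register. The threshold, template dimensions and function names are hypothetical assumptions for illustration and do not describe any particular supplier's system.

# Illustrative only: a facial "template" is a fixed-length vector of numbers
# derived from a face image by an encoder model. Verification compares the
# enrolled template with one computed at the till. All values are hypothetical.
import numpy as np

MATCH_THRESHOLD = 0.6  # assumed operating point; real systems tune this

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two templates; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(enrolled: np.ndarray, probe: np.ndarray) -> bool:
    """Does the image taken at the cash register match the enrolled pupil?"""
    return cosine_similarity(enrolled, probe) >= MATCH_THRESHOLD

# Worked example with made-up 128-dimensional templates
rng = np.random.default_rng(0)
enrolled_template = rng.normal(size=128)
probe_template = enrolled_template + rng.normal(scale=0.1, size=128)
print(verify(enrolled_template, probe_template))  # True: same pupil, slight variation

The choice of threshold is a trade-off: lowering it means fewer pupils are wrongly rejected but increases the chance of a false match, which is one reason the bias testing discussed later in this case study matters.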

Before engaging with an FRT supplier or processing any personal data for this purpose, the education authority must first carry out a Data Protection Impact Assessment (DPIA). This is important because the FRT will process the special category biometric data of vulnerable data subjects (children) using an innovative technology.

This type of processing presents risks to pupils’ rights and freedoms, particularly around bias and discrimination. Children are often less aware of the risks involved when their personal data are collected and processed, and they therefore merit particular protection under data protection law.

The education authority must be able to evidence that the use of the FRT is a necessary and proportionate way to manage payments for a school lunch service. This includes considering whether less intrusive alternatives are available.

The education authority determines that it will rely on consent for the processing. To obtain valid consent, the education authority must ensure that it is freely given, specific, informed and unambiguous, regardless of whether it seeks consent from the pupil or the parent/carer. If the pupil or parents/carers refuse to give consent, the school must offer pupils a genuine alternative to the FRT. Alternatives might include swipe cards, PINs or cash, and choosing these must not result in any detriment to the pupil (eg they must not have to wait longer to buy lunch or pay more for it).

Education authorities in England and Wales must also apply the Protection of Freedoms Act 2012, which sets out rules about parental and child consent for the use of biometrics in schools. These obligations need to be met in addition to data protection requirements. The provisions do not apply in Scotland or Northern Ireland.

The FRT processes special category biometric data, and this must be explained to pupils and their parents/carers. The education authority must make clear what it will do with that data, whether any personal information is shared with a supplier, and the retention policy for the data. It must also explain the risks associated with the processing and any mitigations it has put in place: technologies such as FRT rely on AI and algorithms and present two key risks, bias and discrimination. There may be other risks, and our work on data protection harms supports the risk identification process.

If the education authority wishes to rely on public task as its lawful basis for the processing, it must demonstrate that the use of FRT is ‘necessary’, ie that it is targeted and proportionate, and that the processing of special category biometric data to fulfil that task is clear and foreseeable. It is unlikely that the use of FRT could be considered necessary for the task of providing school lunches.

In any event, as the FRT processes special category data, the education authority must identify an Article 9 condition for processing. It needs to rely on explicit consent to process the biometric facial templates, as the other conditions are not applicable in this context.

The education authority will need to comply with its transparency obligations by creating a child-friendly privacy notice that contains all the information required under the UK GDPR. We have guidance on processing children’s data, with a specific section on the right to be informed and how it applies to children.

The education authority should assure itself that the system has been trained on a representative data sample and that bias testing has been conducted. The education authority should then monitor the system throughout its life cycle and, if biases emerge, make any necessary improvements. It should not proceed unless it is confident that the data accuracy principle can be complied with and that the risks associated with bias and discrimination can be managed.
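By way of illustration, one simple form of bias testing is to compare error rates between demographic groups on a labelled test set. The sketch below is hypothetical Python; the groups, records and data values are illustrative assumptions, not a prescribed methodology.

# Hypothetical bias check: compare the false non-match rate (genuine pupils
# wrongly rejected) across demographic groups in a labelled test set.
from collections import defaultdict

# Each record: (demographic_group, genuine_attempt, system_said_match)
test_results = [
    ("group_a", True, True), ("group_a", True, False),
    ("group_b", True, True), ("group_b", True, True),
    # ... a real evaluation needs a large, representative sample
]

counts = defaultdict(lambda: [0, 0])  # group -> [false_non_matches, genuine_trials]
for group, genuine, matched in test_results:
    if genuine:  # a genuine attempt by the enrolled pupil
        counts[group][1] += 1
        if not matched:
            counts[group][0] += 1

rates = {g: fnm / trials for g, (fnm, trials) in counts.items() if trials}
print(rates)  # {'group_a': 0.5, 'group_b': 0.0} for the toy data above

# A persistent gap between groups would indicate bias that the education
# authority should investigate before relying on the system.

An equivalent check should be run for false match rates, and both checks repeated periodically as part of monitoring the system throughout its life cycle.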

Further guidance can be found in our blog on Human bias and discrimination in AI systems and in our Guidance on AI and data protection.