
Biometric recognition technologies: what are your data protection obligations?

Posted: 27/01/2025


Biometric recognition technologies offer enhanced security, efficiency and convenience, but their use also brings enhanced data protection obligations. Organisations using these systems, particularly educational institutions, must be alert to the risks they pose and the steps that must be taken to mitigate these. 

Why does the Information Commissioner’s Office take the processing of biometric data so seriously?

Many organisations use systems that involve the processing of biometric data, such as fingerprint recognition, facial recognition, iris recognition and voice recognition.

Where biometric data are used to identify individuals, they are considered ‘special category data’ under the UK General Data Protection Regulation (the UK GDPR), and the data controller must clear additional hurdles to demonstrate compliance with their data protection obligations. 

Biometric recognition systems introduce specific risks. They work by comparing two sets of biometric features: a stored biometric reference and a newly captured biometric sample. The two will never be exactly identical, even when they belong to the same person, so the system produces a probabilistic estimate of whether they match. This reliance on probability creates the potential for two types of error: false biometric acceptance (false positives) and false biometric rejection (false negatives). 
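The comparison described above can be sketched in a few lines of code. This is an illustrative sketch only: the cosine-similarity measure, the feature vectors and the threshold value are assumptions for demonstration, not details of any particular biometric product.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two biometric feature vectors (illustrative only)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def matches(reference, sample, threshold=0.95):
    """Declare a match when similarity exceeds the threshold.

    Setting the threshold too low increases false acceptances (false
    positives); setting it too high increases false rejections (false
    negatives). The error rates can be tuned but never both eliminated.
    """
    return cosine_similarity(reference, sample) >= threshold

# Two captures of the same person are similar but never identical.
reference = [0.91, 0.40, 0.12]       # stored biometric reference
same_person_sample = [0.90, 0.41, 0.13]
impostor_sample = [0.20, 0.95, 0.60]

print(matches(reference, same_person_sample))  # True
print(matches(reference, impostor_sample))     # False
```

The key point for data protection purposes is that the decision is an estimate governed by a tunable threshold, not a certainty, so some rate of error is inherent in the design.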

Just like human decision-making, automated processes are prone to biases. If the technology has been trained on a data set that is not representative of the context in which it is to be used, for example, it is likely to be less effective at identifying certain groups. This can result in bias in the algorithm and, ultimately, in discrimination against those groups. Some biometric recognition systems have been found to be biased on the basis of age, race and gender.

Chelmer Valley High School

In July 2024, the ICO issued a reprimand to a school in Essex for its use of facial recognition technology to take cashless payments from pupils in its canteen.

Chelmer Valley High School, in Chelmsford, was reprimanded for beginning to use the technology without first carrying out a data protection impact assessment (DPIA) and for failing to properly obtain permission to process pupils’ biometric data. The ICO noted that the school had also failed to seek opinions from its data protection officer or to consult with parents and pupils before implementing the technology. 

By implementing facial recognition technology without first carrying out a DPIA, the school failed to make any assessment of the risks that might be presented to its pupils by the technology. This type of processing presented a risk to pupils’ rights and freedoms, particularly around bias and discrimination.

What was wrong with the school’s attempt to seek consent to the processing?

The school had sent parents a slip to return if they did not want their child to use the facial recognition technology. ‘Opt-out’ consent, such as this, is not a valid form of consent for the purposes of UK data protection law. By failing to properly obtain valid consent, the school had failed to give its pupils the opportunity to decide whether they did or did not want their data to be used in this way. Children are often less aware of the risks that may be involved when their personal data are collected and processed, so these risks must be explained to them in a way that they can understand. 

In addition, most of the pupils were old enough to provide their own consent, so seeking consent from their parents or carers deprived those pupils of the ability to exercise their own rights over their personal data. 

What should the school have done differently?

Importantly, the ICO has not said that biometric recognition systems, such as facial or fingerprint recognition technologies, should never be used in schools or other educational institutions. Rather, institutions should take proper steps to ensure that risks are assessed and addressed, and valid consent obtained, before these systems are utilised.

Here is a summary of the steps the school should have taken; these steps apply to the use of any system that involves the processing of pupils’ biometric data.

  • First, the school should have carried out a DPIA, because the technology involved the processing of special category data of vulnerable data subjects (children).
  • The school needed to evidence that the use of the technology was a necessary and proportionate way to manage payments for a school lunch service.
  • The school should have sought informed and unambiguous consent to the use of the technology. This means explaining to pupils and parents/carers the risks associated with the technology and any mitigations that the school had put in place. It should also have made clear what it would do with the data that were collected, whether there would be any sharing of data with a supplier and the data retention policy.
  • Where consent was refused, the school would need to give those pupils a genuine alternative (such as a swipe card) that would not cause them any detriment.
  • The school needed to identify a UK GDPR Article 9 condition to justify the processing of special category data. It would need to rely on ‘explicit consent’, as none of the other conditions is applicable.
  • The school needed to create a child-friendly privacy policy in order to comply with its transparency obligations.
  • The school needed to assure itself that the system had been trained on a representative data sample and that suitable bias testing had been conducted. It should then monitor the system throughout its life cycle and make any necessary improvements should any biases emerge.
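The bias-testing step in the final bullet can be illustrated with a short sketch: computing the false rejection rate for each demographic group on a labelled evaluation set and comparing the rates. The group names and figures below are hypothetical; a real evaluation would use a properly designed, representative test set.

```python
from collections import defaultdict

def false_rejection_rates(results):
    """Compute the false rejection rate per demographic group.

    `results` is a list of (group, genuinely_enrolled, accepted) tuples
    from a labelled evaluation run. Only genuine (enrolled) attempts can
    be falsely rejected, so impostor attempts are ignored here.
    """
    rejected = defaultdict(int)
    total = defaultdict(int)
    for group, enrolled, accepted in results:
        if enrolled:
            total[group] += 1
            if not accepted:
                rejected[group] += 1
    return {g: rejected[g] / total[g] for g in total}

# Hypothetical evaluation results for two demographic groups.
evaluation = [
    ("group_a", True, True), ("group_a", True, True),
    ("group_a", True, True), ("group_a", True, False),
    ("group_b", True, True), ("group_b", True, False),
    ("group_b", True, False), ("group_b", True, False),
]

rates = false_rejection_rates(evaluation)
print(rates)  # {'group_a': 0.25, 'group_b': 0.75}
```

A marked disparity between groups, as in this hypothetical output, is the kind of signal that should prompt further investigation, retraining on more representative data, or reconsideration of the system. Monitoring would repeat this check periodically throughout the system’s life cycle.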


Penningtons Manches Cooper LLP

Penningtons Manches Cooper LLP is a limited liability partnership registered in England and Wales with registered number OC311575 and is authorised and regulated by the Solicitors Regulation Authority under number 419867.