The Information Commissioner’s Office recently reprimanded a school for using facial recognition technology without carrying out a data protection impact assessment. Ibrahim Hasan looks at the ICO’s decision.
For a number of years schools have used biometrics, particularly fingerprint scanning, to streamline various processes such as class registration, library book borrowing and cashless catering. Big Brother Watch (BBW) raised privacy concerns about this as far back as 2014. Recently some schools have started to implement facial recognition technology (FRT).
FRT is even more problematic. In May, BBW launched a fundraiser to support two members of the public in bringing legal challenges after FRT wrongly flagged them as criminals. And in January 2023, the ICO issued a letter to North Ayrshire Council (NAC) following its use of FRT in school canteens. The Financial Times reported that, “nine schools in North Ayrshire will start taking payments for school lunches by scanning the faces of pupils, claiming that the new system speeds up queues and is more Covid-secure than the card payments and fingerprint scanners they used previously.”
Last week the ICO issued a reprimand to Chelmer Valley High School, in Chelmsford, after it started using FRT to take cashless canteen payments from students. The ICO said that the school failed to complete a Data Protection Impact Assessment (DPIA), as required by Article 35(1) of the UK GDPR, prior to introducing the system.
As readers will know, when processing any form of biometric data, a data controller requires a lawful basis under Article 6 of the UK GDPR as well as a condition under Article 9, because biometric data used to identify individuals is Special Category Data. In most cases, the only lawful basis for FRT usage is express consent (see the GDPR Enforcement Notices issued to public service provider Serco Leisure, Serco Jersey and seven associated community leisure trusts requiring them to stop using FRT and fingerprint scanning to monitor employee attendance).
In March 2023, Chelmer Valley High School sent a letter to parents with a slip for them to return if they did not want their child to participate in the FRT. Positive opt-in consent (express consent) was not sought, meaning that until November 2023 the school was wrongly relying on assumed (opt-out) consent. The ICO noted that most students were old enough to provide their own consent; parental opt-out therefore deprived students of the ability to exercise their rights and freedoms.
The ICO also noted that the school had failed to consult its Data Protection Officer, or the parents and students, before implementing the technology. The reprimand included a set of recommendations:
- Prior to new processing operations, or upon changes to the nature, scope, context or purposes of processing for activities that pose a high risk to the rights and freedoms of data subjects, complete a DPIA and integrate its outcomes back into the project plans (see our DPIA workshop).
- Amend the DPIA to give thorough consideration to the necessity and proportionality of cashless catering, and to mitigating specific, additional risks such as bias and discrimination.
- Review and follow all ICO guidance for schools considering whether to use facial recognition for cashless catering.
- Amend the privacy information given to students so that it provides for their information rights under the UK GDPR in an appropriate way (see our Children’s Data workshop).
- Engage more closely, and in a timely fashion, with its DPO when considering new projects or operations processing personal data, and document the DPO’s advice and any changes to the processing made as a result.
Ibrahim Hasan is a solicitor and director of Act Now Training.
All the recent GDPR developments will be discussed in detail on Act Now’s forthcoming GDPR Update workshop. They have a few places left on their Advanced Certificate in GDPR Practice course starting in September.