In conversation with my teenage daughter last week, I pointed out a news report which flagged concerns over the use of facial recognition technologies in several school canteens in North Ayrshire, Scotland. Nine schools in the area recently launched this practice as a means to take payment for lunches more quickly and minimise COVID risk, though they’ve since paused rolling out the technology.
When I asked my daughter if she would have any concerns about the use of facial recognition technology in her school canteen, she casually replied: “Not really. It would make things a lot faster at checkout though.”
Her words validate the concern that children are much less aware of their data rights than adults are. And although there are special provisions and safeguards for children under a range of data protection laws, the use of facial recognition technology on children could pose unique privacy risks.
Facial recognition technologies identify and authenticate people’s identities by detecting, capturing and matching faces to images from a database. The technologies are powered by artificial intelligence (AI), specifically the technology known as machine learning.
Machine learning predicts outcomes based on historical data that has been fed into the system. So for facial recognition, machine learning predicts the identity associated with a digital representation of a person’s face, or “face print”, based on a database of facial images. The software adapts through this experience, in time learning to generate predictions more accurately.
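The matching step described above can be sketched in a few lines of code. This is a simplified illustration, not a real system: the `identify` function and the four-number “face prints” are hypothetical stand-ins for the high-dimensional embeddings a trained neural network would actually produce, but the core idea is the same: compare a captured face print against every enrolled print and return the closest match above a confidence threshold.

```python
import numpy as np

def identify(probe, database, threshold=0.8):
    """Return the best-matching identity for a face print, or None.

    A face print is modelled here as a plain numeric vector; a real
    system would derive it from a neural network, but the matching
    step is the same: score the probe against each enrolled print.
    """
    def cosine(a, b):
        # Cosine similarity: 1.0 means identical direction, 0 unrelated.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    best_name, best_score = None, threshold
    for name, enrolled in database.items():
        score = cosine(probe, enrolled)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy "database" of enrolled face prints (hypothetical 4-number vectors).
db = {
    "alice": np.array([0.9, 0.1, 0.3, 0.7]),
    "bob":   np.array([0.2, 0.8, 0.6, 0.1]),
}

# A new capture that closely resembles alice's enrolled print.
probe = np.array([0.88, 0.12, 0.28, 0.72])
print(identify(probe, db))  # → alice
```

Notice that the result is a prediction, not a certainty: if no enrolled print scores above the threshold, the function returns nothing, and a poorly chosen threshold can produce false matches. That uncertainty is part of why the stakes rise when the people being matched are children.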
Facial recognition technology is now used in a variety of ways, such as to verify the identity of employees, to unlock personal smartphones, to tag people on social media platforms like Facebook, and even for surveillance purposes in some countries.
Facial recognition technology on its own is not the problem. Rather, the issue is how it’s used and, in this instance, the fact the technology has now infiltrated school corridors and targeted a vulnerable demographic: children.
So what are the privacy issues for children?
Your face print is your data, so for any facial recognition system it’s important to understand how the image databases are collated and stored. Although I may grudgingly agree to the use of facial recognition technology to enter a concert venue, I wouldn’t be thrilled if my face print was retained for “other commercial purposes of the company” (a phrase that appears quite commonly in the fine print of ticket sales regarding the use of personal data).
If facial recognition technology is used in school settings, we’ll need clear information about whether and how students’ images will be used beyond the purpose of the lunch queue. For example, are they going to be shared with any third parties, and for what purpose? Issues could arise, say, if face prints are linked to other data on the child, like their lunch preferences. Third parties could theoretically use this data for marketing purposes.
We would also need information about how the images would be protected. If the students’ face prints aren’t properly secured, or the system isn’t robust enough to fend off hackers, this creates cyber-security risks. It may be possible for hackers to link children’s face prints to other data about them, and track them.