Facial recognition in schools: here are the risks to children

In conversation with my teenage daughter last week, I pointed out a news report which flagged concerns over the use of facial recognition technologies in several school canteens in North Ayrshire, Scotland. Nine schools in the area recently launched this practice as a means to take payment for lunches more quickly and minimise COVID risk, though they’ve since paused rolling out the technology.

When I asked my daughter if she would have any concerns about the use of facial recognition technology in her school canteen, she casually replied: “Not really. It would make things a lot faster at checkout though.”

Her words validate the concern that children are much less aware of their data rights than adults are. And although there are special provisions and safeguards for children under a range of data protection laws, the use of facial recognition technology on children could pose unique privacy risks.

Facial recognition technologies identify and authenticate people by detecting, capturing and matching their faces to images in a database. The technologies are powered by artificial intelligence (AI), specifically the technique known as machine learning.

Machine learning predicts outcomes based on historical data that has been fed into the system. So for facial recognition, the algorithm predicts the identity associated with a digital representation of a person’s face, or “face print”, based on a database of facial images. The software adapts through this experience, in time learning to generate predictions more accurately.
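The matching step described above can be sketched in a few lines of code. This is a minimal illustration, not a real system: the student names, the tiny three-number “face prints” and the distance threshold are all hypothetical stand-ins for the high-dimensional embeddings a trained neural network would actually produce.

```python
import math

# Hypothetical enrolment database: each student is represented by a small
# numeric "face print" (real systems use embeddings with hundreds of
# dimensions produced by a trained neural network).
enrolled = {
    "student_a": [0.1, 0.9, 0.4],
    "student_b": [0.8, 0.2, 0.7],
}

def euclidean(a, b):
    """Distance between two face prints: smaller means more similar."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(probe, database, threshold=0.5):
    """Return the closest enrolled identity, or None if no stored print
    is similar enough (an 'unknown face' rejection)."""
    best_id, best_dist = None, float("inf")
    for identity, stored_print in database.items():
        dist = euclidean(probe, stored_print)
        if dist < best_dist:
            best_id, best_dist = identity, dist
    return best_id if best_dist <= threshold else None

# A new camera capture that closely resembles student_a's stored print.
print(identify([0.12, 0.88, 0.41], enrolled))  # → student_a
```

Even this toy version shows where things can go wrong: a threshold set too loosely will match the wrong child, which is exactly the kind of error, such as an incorrect lunch charge, discussed later in the article.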

Facial recognition technology is now used in a variety of ways, such as to verify the identity of employees, to unlock personal smartphones, to tag people on social media platforms like Facebook, and even for surveillance purposes in some countries.

Facial recognition technology on its own is not the problem. Rather, the issue is how it’s used and, in this instance, the fact that the technology has now infiltrated school corridors and targeted a vulnerable demographic: children.

So what are the privacy issues for children?

Your face print is your data, so for any facial recognition system it’s important to understand how the image databases are collated and stored. Although I may grudgingly agree to the use of facial recognition technology to enter a concert venue, I wouldn’t be thrilled if my face print was retained for “other commercial purposes of the company” (a phrase that appears quite commonly in the fine print of ticket sales regarding the use of personal data).

If facial recognition technology is used in school settings, we’ll need clear information as to whether and how students’ images will be used beyond the purpose of the lunch queue. For example, are they going to be shared with any third parties, and for what purpose? Issues could arise, say, if face prints are linked to other data on the child, like their lunch preferences. Third parties could theoretically use this data for marketing purposes.

We would also need information as to how the images would be protected. If the students’ face prints aren’t properly secured, or the system isn’t robust enough to fend off hackers, this creates cyber-security risks. It may be possible for hackers to link children’s face prints to other data about them, and track them.

The heightened privacy risk surrounding the use of facial recognition technologies in schools also relates to informed consent. Although UK data protection law specifies that children aged 13 and over can consent to the processing of their personal data, this doesn’t mean they fully understand the implications. For example, one survey found children between ages eight and 15 had difficulty understanding the terms and conditions of Instagram.

Children, parents and guardians should be provided with nothing less than full information, couched in language children can easily understand. Any data subject, including a child, has the right to know exactly how their personal data will be processed, shared, and stored, and can specify the conditions under which their consent will apply. Anything less than prudence and transparency will risk jeopardising children’s privacy.

Normalising the surveillance of children?

These are just some of the questions raised by the use of facial recognition technologies in schools. The technology also carries other risks, such as errors, which could, for example, lead to students being charged incorrectly. And as with any AI system, we should be concerned about whether the algorithms and data sets are free from bias, and whether the training data is clean, complete and representative.

Importantly, employing facial recognition technologies in schools also goes some way to normalising the surveillance of children. It’s possible the knowledge they are being tracked in this way could impact some children’s wellbeing.

It’s not surprising that the UK’s data watchdog, the Information Commissioner’s Office, has stepped in to investigate the use of facial recognition technologies in school lunch queues. And in light of the inquiry, it’s pleasing to see North Ayrshire Council has paused rolling out the practice.

But as we move further into the digital age, it’s possible the use of facial recognition technologies among schoolchildren will resume, and even be taken up more widely. If this is to happen, the use of facial recognition must yield substantially more benefits than risks, taking into account the special circumstances of using the technology on children.