Members of the European Parliament: will you stand up for our rights?

Today, a global coalition of 53 civil society organisations has joined together to call on Members of the European Parliament to use their democratically-elected powers to protect us all from biometric mass surveillance practices. The EU must not legitimise these dangerous practices. Otherwise, EU lawmakers risk setting a precedent for uses of AI-based technology which could destroy people’s anonymity forever and suppress a broad range of our rights and freedoms.

 

Dear honourable Members of the European Parliament,

We write to you today as 53 organisations to ask: Will you stand up for our rights by prohibiting biometric mass surveillance in the Artificial Intelligence Act?

In Europe and across the world, the use of remote biometric identification (RBI) systems such as facial recognition, in our publicly accessible spaces, represents one of the greatest threats to fundamental rights and democracy that we have ever seen.

The remote use of such systems destroys the possibility of anonymity in public, and undermines the essence of our rights to privacy and data protection, the right to freedom of expression, rights to free assembly and association (leading to the criminalisation of protest and causing a chilling effect), and rights to equality and non-discrimination.

Without an outright ban on the remote use of these technologies in publicly accessible spaces, all the places where we exercise our rights and come together as communities will be turned into sites of mass surveillance where we are all treated as suspects.

 

These harms are not hypothetical. Uyghur Muslims have been systematically persecuted by the Chinese government through the use of facial recognition. Pro-democracy protesters and political opponents have been suppressed or targeted in Russia, Serbia and Hong Kong through the use – and in some cases, even just the fear of the use – of RBI in publicly accessible spaces. And many people around the world have been wrongfully and traumatically arrested.1

In response to the ever-increasing proliferation of these uses and their harms, people are pushing back and calling for prohibitions. More than 24 US states have taken steps against facial recognition or other forms of biometric mass surveillance. In South America, two recent rulings in São Paulo and Buenos Aires have ordered the suspension of facial recognition systems.

Some of the world’s biggest providers of biometric surveillance systems – Microsoft, Amazon and IBM – have even adopted self-imposed moratoriums due to the major risks and harms that they know their systems perpetuate; and Facebook has deleted its mass facial image database.

Despite the strong protections afforded to biometric data in EU data protection law, we see companies and public authorities systematically misusing “consent” and vague security justifications as a basis for the use of facial recognition and other biometric systems in ways that amount to inherently disproportionate mass surveillance practices.

 

While democratic countries around the world are taking steps to protect their communities, the EU is heading in the opposite direction.

 

A clear, unambiguous prohibition is needed in the AI Act to put a stop to the dangerous status quo.2 In 2021, the European Parliament adopted a powerful stance against biometric mass surveillance practices in its report on AI in criminal law, which calls for: “a ban on any processing of biometric data, including facial images, for law enforcement purposes that leads to mass surveillance in publicly accessible spaces” (Article 31).

The AI Act is the obvious way for this important European Parliament resolution to be translated into binding, impactful law.

The urgent need for further action has also been recognised at EU Member State level. Italy has introduced Europe’s first moratorium on public facial recognition. The German coalition government has called for an EU-wide ban on biometric mass surveillance practices. Portugal dropped a law which would have legalised some biometric mass surveillance practices. And the Belgian Parliament is considering a moratorium on biometric surveillance.

 

Will you make (the right kind of) history?

 

There is already significant evidence that European residents have been systematically subjected to biometric mass surveillance practices. From football fans, to schoolchildren, to commuters, to shoppers, to people visiting LGBTQ+ bars and places of worship, the harms are real and prevalent. Via the Reclaim Your Face campaign, over 70,000 EU citizens urge you and your fellow lawmakers to better protect us from these undemocratic and harmful biometric systems.

Around the world, over 200 civil society organisations, from Burundi to Taiwan, have signed a letter calling for a global ban on biometric surveillance. As the first region to comprehensively regulate artificial intelligence, the EU’s actions – or inaction – will have major ramifications for biometric mass surveillance practices in every corner of the globe.

While dozens of US states are learning from horrendous mistakes such as the facial recognition-enabled suppression of Black Lives Matter protesters, governments in India, China and Russia are moving in the opposite direction. Which side of history will the EU be on: legitimising authoritarian technological surveillance, or choosing fundamental rights?

 

How can we make this a reality in the AI Act?

The AI Act must prohibit all remote (i.e. generalised surveillance) uses of biometric identification (RBI) in publicly accessible spaces. This means that uses like unlocking a smartphone or using an ePassport gate would not be prohibited. While Article 5(1)(d) already aims to prohibit some uses of RBI, its scope is so narrow and contains so many exceptions that it practically provides a legal basis for practices that should, in fact, already be prohibited under existing data protection rules.

We therefore call on you to propose amendments to Article 5(1)(d)3 which would:

  • Extend the scope of the prohibition to cover all private as well as public actors;
  • Ensure that all uses of RBI (whether real-time or post) in publicly accessible spaces are included in the prohibition; and
  • Delete the exceptions to the prohibition, which independent human rights assessments confirm do not meet existing EU fundamental rights standards.

To ensure a comprehensive approach to the protection of biometric data, we additionally urge you to use the opportunity provided by the AI Act to put a stop to discriminatory or manipulative forms of biometric categorisation, and to properly address the risks of emotion recognition.

The EU aims to create an “ecosystem of trust and excellence” for AI and to be the world leader in trustworthy AI. Accomplishing these aims will mean putting a stop to applications of AI that undermine trust, violate our rights, and turn our public spaces into surveillance nightmares. We can promote AI that really serves people, while stamping out the most dangerous applications of this powerful technology.

 

That’s why the EU must truly put people at the heart of this legislation, and put forward amendments to the IMCO-LIBE report on the AI Act which will ensure a genuine ban on biometric mass surveillance practices.

 

Signed,

Reclaim Your Face

Organisational signatories:

Access Now (International)

AlgorithmWatch (European)

Alternatif Bilisim (AIA – Alternative Informatics Association) (Turkey)

anna elbe – Weitblick für Hamburg (Germany)

ARTICLE 19: Global Campaign for Free Expression (International)

Asociatia pentru Tehnologie si Internet – ApTI (Romania)

Barracón Digital (Honduras)

Big Brother Watch (UK)

Bits of Freedom (the Netherlands)

Blueprint for Free Speech (International)

Center for Civil Liberties (Ukraine)

Chaos Computer Club (Germany)

Civil Liberties Union for Europe (European)

D3 – Defesa dos Direitos Digitais (Portugal)

Digital Rights Watch (Australia)

Digitalcourage (Germany)

Digitale Freiheit (Germany)

Digitale Gesellschaft (Germany)

Digitale Gesellschaft CH (Switzerland)

Državljan D / Citizen D (Slovenia / European)

Eticas Foundation (European / International)

European Center For Not-For-Profit Law Stichting (ECNL) (European)

European Digital Rights (EDRi) (International)

European Disability Forum (EDF) (European)

Fachbereich Informatik und Gesellschaft, Gesellschaft für Informatik e.V. (Germany)

Fair Trials (International)

Fight for the Future (United States)

Football Supporters Europe (FSE) (European)

Hermes Center (Italy)

Hiperderecho (Perú)

Homo Digitalis (Greece)

Internet Law Reform Dialogue (iLaw) (Thailand)

Internet Protection Society (Russia / European)

Intersection Association for Rights and Freedoms (Tunisia)

IT-Pol Denmark (Denmark)

International Legal Initiative (Kazakhstan)

Iuridicum Remedium (IuRe) (Czech Republic)

JCA-NET (Japan)

Korean Progressive Network Jinbonet (Republic of Korea)

La Quadrature du Net (France)

Lady Lawyer Foundation (International)

LaLibre.net Tecnologías Counitarias (Ecuador / Latin America)

Ligue des droits de l’Homme (LDH) (France)

Ligue des droits humains (Belgium)

LOAD e.V. – Association for liberal internet policy (Germany)

Masaar – Technology and Law Community (Egypt)

Panoptykon Foundation (Poland)

Privacy International (International)

Privacy Network (Italy)

Statewatch (Europe)

Usuarios Digitales (Ecuador)

Wikimedia Deutschland (Germany / European)

Wikimedia France (France / European)

Individual signatories:

Douwe Korff, Emeritus Professor of International Law

Dr Vita Peacock, Anthropologist

Edson Prestes, Full Professor, Federal University of Rio Grande do Sul (Brazil)

 

1 For example: https://www.aclu.org/news/privacy-technology/i-did-nothing-wrong-i-was-arrested-anyway; https://www.nytimes.com/2020/12/29/technology/facial-recognition-misidentify-jail.html; https://www.wired.com/story/wrongful-arrests-ai-derailed-3-mens-lives/; https://edri.org/our-work/dangerous-by-design-a-cautionary-tale-about-facial-recognition/; https://www.law.georgetown.edu/privacy-technology-center/publications/garbage-in-garbage-out-face-recognition-on-flawed-data/

2 The General Data Protection Regulation, Article 9, paragraph 4, foresees additional protections for biometric data: “Member States may maintain or introduce further conditions, including limitations, with regard to the processing of … biometric data”.

3 This must be supported by a new Recital to better define “remote” use cases as those where cameras/devices are installed at a distance that creates the capacity to scan multiple persons, and which in theory could identify one or more of them without their knowledge. Warning notices do not annul such a definition.

 

 

TYPES OF BIOMETRIC MASS SURVEILLANCE

General Monitoring

Learn how biometric mass surveillance affects you, whoever you are. It can happen anywhere: schools, football matches, cultural events and protests.

Predictive Policing

Learn how biometric mass surveillance threatens individual autonomy and the presumption of innocence on the basis of your race and class.

Government Databases

Learn more about the huge biometric databases that are growing endlessly in Europe, providing a hidden infrastructure for biometric mass surveillance.

Borders, migrants, and failed humanitarianism

Learn how non-EU nationals travelling into the EU are frequently treated as a frontier for experimentation with biometric technologies.

Social Media and Online Scraping

Learn how the photos and videos you put online end up being taken and processed by shady private companies and used to train algorithms.