Funding Matters: Huawei is being accused of collaborating in genocide. So why are UvA and VU’s scientists working with the company?


Summary

The University of Amsterdam (UvA) and the Vrije Universiteit Amsterdam (VU) have established a collaboration with Huawei, a global tech company headquartered in China. The firm will contribute funding and data to a project called the DReaMS Lab, which will work on improving search engine technology. This collaboration is deeply troubling: Huawei is actively involved in serious human rights violations in China against the Uighur people, who are bringing a claim of genocide against China through the UK courts. The company is also accused of setting up surveillance infrastructures and programs which have been used against activists and civil society. Evidence from multiple sources shows that this partnership has not been subject to appropriate ethics review and should not have been approved. We join earlier initiatives in raising our concerns about this collaboration and call for a more effective and meaningful ethics review of such research partnerships by universities, for Dutch institutions not to engage with Huawei as a partner, and for a public debate on how such engagements with companies should be pursued.

Statement

A collaboration with industry partners?

How should academic researchers collaborate with the private sector? Academic funders such as the NWO and the European Commission increasingly demand that researchers collaborate with companies and source research funding from the private sector. The collaboration between the University of Amsterdam (UvA), the Vrije Universiteit Amsterdam (VU) and Huawei makes clear that this policy provides no safeguards against collaborations with those complicit in grave human rights violations. Huawei will invest 3.5 million euros and will contribute data to the new ‘DReaMS Lab’, which will work on search engine optimisation. There has been resistance from, among others, employees of the UvA and VU, and from the UvA’s Centrale Ondernemingsraad (COR), which represents university employees as a group. The COR says: ‘collaboration implies that the company is welcome in the West and appears to legitimise its other practices, despite its dubious recent history’.

We would like to emphasise that our objection to this collaboration is not about drawing a simple line between tech companies of the West versus “the rest”, but about assessing proposed collaborations with all private institutions by meaningful standards that include human costs.

Who are the collaborators?

The DReaMS Lab is led by three researchers at the VU and UvA, who are also members of the KNAW and one of whom co-developed the Dutch AI Research Agenda [https://www.nwo.nl/en/news-and-events/news/2019/11/first-national-research-agenda-for-artificial-in…]. The DReaMS Lab will conduct research on information retrieval, popularly known as search. The project leaders explain that Huawei has a renewed interest in information retrieval due to geopolitical shifts, namely a US boycott that blocked Huawei’s access to Google services, including search. By providing access to its data and internal services, the company expects to benefit from the research results as well as from access to “talent”.

The funder, Huawei, is a global technology company that also maintains a global track record of human rights violations. In China, Huawei has been using Uighur slave labour provided by the government and has been contracting to provide the surveillance apparatus used to control and persecute the Uighur population – activities for which the Chinese government is now facing charges of genocide in the UK courts and calls from numerous international organisations for a UN investigation. Huawei has not merely offered technical support to the Chinese government’s persecution of the Uighur people: the firm has been shown to be actively engaged in R&D with the security services in Xinjiang and in projects with media providers to manipulate public opinion in the province.

Huawei also contracts with governments in other countries to provide illegitimate spying software and services. In Uganda, Huawei engineers have configured NSO Group spyware for the government; NSO Group has been heavily criticised after research by the renowned Citizen Lab (University of Toronto) showed that its software was used to target journalists (among others, New York Times journalist Ben Hubbard) and human rights defenders. Huawei’s “Safe City” program, the surveillance product the company markets internationally as a “competitive” way to “improve policing efforts” in cities, has raised further concerns among civil society and activists across the world.

What was the review process for this collaboration?

Neither of the two universities shows any sign of having weighed the ethical or political implications of this deal. Instead the partnership has been scrutinised on the basis of a limited understanding of academic integrity: will the researchers be free to publish their findings without commercial claims or editorial hindrance, will the research groups in question be free to choose their own staff, and will the products be free and open for others to use? The answer to the first two questions is yes, but to the third, no – Huawei will receive a commercial interest in the results of the research, since it will have the right to apply for patents on the research outputs [https://www.advalvas.vu.nl/opinie/legitimeert-de-vu-de-onderdrukking-van-de-oeigoeren].

The government, on the other hand, has raised concerns about national security, in response to which data management plans have been reviewed by the Dutch intelligence services. The AIVD [the Dutch national security agency] has cleared the universities’ cooperation with Huawei, but its assessment concerns exclusively the national security of the Netherlands. Both these approaches – focusing on academic integrity and national security – miss the point. As the COR points out, by collaborating with Huawei the UvA and VU are contributing something priceless to the company: legitimacy. If Huawei wants to operate in the EU, it needs high-profile allies and projects to direct attention away from its politics.

The researchers leading the AI project reassure the press that ‘the same rules and guarantees hold’ as in their collaborations with ‘ING, Ahold and Elsevier’. But none of those companies is currently being accused of collaborating in genocide. Rather than asking the obvious questions about this partnership, both university and governmental authorities are working hard to avert any discussion of the bigger picture. A leader of the lab has said that ‘the crucial risk is of a derailing social controversy’ (‘ontsporende maatschappelijke discussie’), inadvertently acknowledging that a discussion focused on the actual issues in play might indeed endanger the collaboration. We reject the idea that as researchers we have a responsibility not to rock the boat, or that we are betraying our universities by questioning the enthusiastic welcome this company is receiving into our academic community.

What should be the scope of an ethics analysis?

In Race After Technology, Ruha Benjamin shows how, while many firms are complicit in persecution and human rights abuses, others actively promote them. She offers the example of Polaroid, which during the apartheid era in South Africa developed new photo-identification techniques to capture black faces for the pass-books used to restrict the freedom of black citizens. Its support of the South African regime’s aims can be compared to the collaboration between IBM and the Nazi government in Germany in the 1930s, where the firm provided innovative technical and financial constructions to capture the market for identifying and locating the groups eradicated in the Holocaust.

Computer science, as a field of ever-growing importance, has yet to resolve its ethics problem. Most recently, we have observed how research ethics in AI is actively shaped by economic priorities, requiring an urgent update so that it can withstand this pressure. Meaningful ethical review must take account of human consequences, yet if the data to be supplied by Huawei to the new lab were considered by an ethics review board, it is unlikely it would be flagged as problematic. There is a well-documented mismatch between conventional research ethics and practices in big data and AI development, where the subjects of research are often too far downstream of the research to be considered at risk of harm. In this case, we have no information about how the datasets that will be shared have come into being, who the downstream subjects of this research will be, or which values will be prioritised due to the collaboration. How can research ethics assess harmful (side) effects more effectively and include a transnational dimension? Even if research ethics were fully up to the task, the relevant independent ethical oversight is missing at the UvA. The university’s ‘general institutional ethics commission’ (Algemene Instellingsgebonden Ethische Commissie) has reportedly not met in two years, has no chairperson and no agenda.

What next?

It is time to update university ethics processes to reflect the kinds of collaboration being proposed. The UvA’s COR has called for a collective Code of Conduct to be established and for a university-wide Ethics Committee to review all research partnerships with companies before they are brought to the Executive Board and the COR. We support their position, and call on all Dutch universities to adopt this approach and to institute ethical review at the central level that, at a minimum, takes into account the human rights records of proposed collaboration partners. The case of Huawei, as well as that of other companies from all parts of the globe, also makes clear that it is necessary to move beyond ethics committees to a public debate. We urge Dutch institutions, including municipalities and other public authorities, not to engage with Huawei as a partner, and the Dutch government to consider not only matters of academic integrity and the security implications of working with companies, but human rights violations as well.

Funding Matters 

(If you want to sign on to this statement, send an email to signon[at]fundingmatters[dot]tech, indicating your name and affiliation. If you have other questions, send an email to enquiries[at]fundingmatters[dot]tech.)

Co-signed:

Global Data Justice project – Tilburg University

DATACTIVE research project – UvA

Faculty Student Council of the Faculty of Humanities (FSR FGw) – UvA

Lonneke van der Velden – UvA

Niels ten Oever – UvA

Stefania Milan – UvA

Linnet Taylor – Tilburg University

Seda Gürses – TU Delft

Hans de Zwart – Amsterdam University of Applied Sciences

Annelies Moors – UvA

Thomas Poell – UvA

Joris van Hoboken – UvA

Sarah Eskens – UvA

Olav Velthuis (AISSR) – UvA

Matthijs Koot – UvA

Alex Gekker – UvA

Zazie van Dorp – UvA

Marjolein Lanzing – UvA

Naomi Appelman – UvA

Kristina Irion – UvA

Jill Toh – UvA

Ot van Daalen – UvA

Miriyam Aouragh – University of Westminster

Angela Wigger – Radboud University

Tamar Sharon – Radboud University

Bart Jacobs – Radboud University

Esther Keymolen – Tilburg University

Aviva de Groot – Tilburg University

Aaron Martin – Tilburg University

Gijs van Maanen – Tilburg University

Tobias Fiebig – TU Delft

Jaap-Henk Hoepman – Radboud University and University of Groningen

Francien Dechesne – Leiden University

Carolina Frossard – UvA

Luiza Bialasiewicz – UvA

Nadya Purtova – Tilburg University