5 reasons why surveillance is a feminist issue


https://www.genderit.org/feminist-talk/5-reasons-why-surveillance-feminist-issue



“Collecting and storing data is never a neutral act, and neither are data analysis or predictive modelling”

Republished with permission from the LSE Engenderings blog

Above is a public domain image of full body scanning by millimeter wave technology.

Surveillance is woven into our everyday lives. While this in itself is not new, what we experience today differs in scale from, say, covert surveillance photos of suffragettes, tabs on unions and protesters during the Cold War era, or even the practices of the GDR’s Stasi.

Given the sheer variety and quantity of data constantly accumulated about any one of us, DeLillo’s (1985) fictional speculation that “you are the sum total of your data” has proven quite visionary. To speak of the informatisation of the body, data doubles, or data bodies is no longer fiction but scholarship.

Contemporary surveillance practices are to a large extent big data driven, underpinned by a collect-it-all logic, and ever expanding due to a fear-mongering yet pervasive national security discourse. Surveillance technologies and practices have not only multiplied in scale and quantity. They have also qualitatively transformed into an “assemblage” characterised by multiple sites of data collection and analysis, collaboration between different surveillance states as well as agencies, flesh/information flows, and the capacity for further expansion and variation. Others have been less eloquent and simply named the beast the “surveillance industrial complex”.

While feminist work on surveillance is emerging academically as well as in activist circles, all too often feminist issues on the one hand, and discussions around privacy and surveillance on the other, still feel like separate domains. What follows is my attempt at emphasising that thinking them together makes a lot of sense.


1. Surveillance is about social justice

First off, a perhaps blatantly obvious point worth re-stating: Surveillance is not a topic reserved for tech geeks and the security industry; it is decidedly a social justice issue.

Once, concerns about surveillance were couched primarily in the language of privacy and, possibly, freedom. (…) While these issues are still significant, it is becoming increasingly clear to many that they do not tell the whole story. For surveillance today sorts people into categories, assigning worth or risk, in ways that have real effects on their life-chances. Deep discrimination occurs, thus making surveillance not merely a matter of personal privacy but of social justice (Lyon 2003:1).


Thinking about/against surveillance necessarily involves questioning its underlying power relations and moving away from a blind belief in the objectivity of data and algorithms. Collecting and storing data is never a neutral act, and neither are data analysis or predictive modelling. A few years ago Laurie Penny argued that surveillance and patriarchy function in fairly similar ways, and that therefore the fight for the principles of free speech, the fight against surveillance and the fight for a society where whistleblowers are protected is a feminist fight.

Feminist work has decades of experience in dealing with precisely such questions in different contexts. An insistence that power relations matter, situated knowledges, critiques of objectivity, feminist science and technology studies, intersectionality, work around agency and coercion, and vast experience in interdisciplinary work, taken together, make a great toolkit to intervene in the gendered, racialised, and classed effects of surveillance practices.

2. The problem with categories

Categorisations along the lines of gender and sexualities and their intersections with class, race, and other differentiations have long been feminist issues for very good reasons. The same scrutiny of categories needs to be applied to their use in technologies of surveillance, where their normalising effects are no less chilling than elsewhere.

Contemporary surveillance heavily relies on statistical categories and algorithms, resulting in effective mechanisms of social sorting. In addition to gaps in the data that are filled by assumptions (guess whose), statistical categories by definition operate by proximity to a norm. Unsurprisingly, the unspoken norm all data bodies are measured against is once again white cis-male heterosexuality.
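
To make that mechanics concrete, here is a deliberately simplified sketch of proximity-to-a-norm scoring. It is not the code of any real screening system; the feature names, baseline profile and weights are invented for illustration, and missing values are silently assumed to match the norm.

```python
# Toy sketch only: a hypothetical "risk" score built as weighted distance from
# an unspoken baseline profile. All feature names, baseline values and weights
# are invented; no real screening system is reproduced here.

BASELINE = {              # the implicit "norm" every record is measured against
    "travel_pattern": 0.1,
    "name_match_score": 0.9,
    "body_scan_anomaly": 0.0,
}
WEIGHTS = {"travel_pattern": 1.0, "name_match_score": 2.0, "body_scan_anomaly": 3.0}


def risk_score(record: dict) -> float:
    """Return the weighted distance of a record from the baseline profile."""
    score = 0.0
    for feature, norm_value in BASELINE.items():
        # Gaps in the data are filled by an assumption: absence = conformity.
        value = record.get(feature, norm_value)
        score += WEIGHTS[feature] * abs(value - norm_value)
    return score


# Whoever happens to sit close to the baseline scores "low risk"; whoever
# deviates from it is flagged, regardless of whether the baseline was ever
# a neutral choice in the first place.
print(risk_score({"travel_pattern": 0.1, "name_match_score": 0.95}))                           # ~0.1
print(risk_score({"travel_pattern": 0.6, "name_match_score": 0.4, "body_scan_anomaly": 0.7}))  # ~3.6
```

The point is not the arithmetic but the choice of baseline: whoever defines the “norm” also decides who shows up as an anomaly.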

Take for example full body scanners at international airports and how they disproportionately affect particular bodies, including people with disabilities, genderqueer bodies, racialised groups or religious minorities. To illustrate how algorithms are by no means neutral we can also revisit the discussions of Google image search results for “unprofessional hair” (hint: black women with natural hair), “women” or “men” (hint: normatively pretty white people). Whether we argue that Google’s search algorithm is racist per se, or concede that it merely reflects the racism of wider society – the end result remains far from neutral.

As Conrad (2009) notes, predictive models fed by surveillance data necessarily reproduce past patterns. They cannot take into effective consideration randomness, ‘noise’, mutation, parody, or disruption unless those effects coalesce into another pattern. More generally, the effects of surveillance on non-normative bodies and marginalised groups – particularly when read and sorted through potentially racist, heterosexist, or Islamophobic algorithms – warrant queer feminist interventions.
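
As a minimal sketch of this point (the training data and the “model” below are entirely fabricated for illustration), consider a predictor that learns only from past flagging decisions: whatever pattern sits in that history, however discriminatory, simply becomes the prediction.

```python
from collections import defaultdict

# Entirely fabricated history of past (group, was_flagged) decisions, in which
# group "B" was flagged far more often than group "A".
history = [("A", 0)] * 90 + [("A", 1)] * 10 + [("B", 0)] * 50 + [("B", 1)] * 50

# A bare-bones "predictive model": estimate P(flagged | group) from the past.
counts = defaultdict(lambda: [0, 0])      # group -> [times_flagged, total]
for group, flagged in history:
    counts[group][0] += flagged
    counts[group][1] += 1


def predicted_flag_rate(group: str) -> float:
    flagged, total = counts[group]
    return flagged / total


print(predicted_flag_rate("A"))  # 0.1 -> past leniency towards A is reproduced
print(predicted_flag_rate("B"))  # 0.5 -> past over-policing of B is reproduced

# Nothing in the model distinguishes "B is riskier" from "B was watched more";
# randomness or change only registers once it coalesces into a new pattern.
```

The same dynamic holds for more sophisticated models: as long as the only input is a record of past decisions, the output can only echo them.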


3. Nothing to hide?

Despite many very convincing rebuttals of this narrative (for example here, here or here), as well as its attribution to prominent Nazis, the lazy retort of “I have nothing to hide” in the face of mass surveillance has not gone out of style. My favourite response comes from Edward Snowden and makes abundantly clear that this argument is a fallacy for all.

 

Its feminist implications, however, reach further than immediately meets the eye. Under heteronormative institutions and within borders that come with their own racialised and sexualised technologies of control, how much is safe to reveal and what must remain hidden is not equal for all.

Former Google CEO Eric Schmidt, exemplary rich white man, has prominently expressed the opinion that whoever feels they have something to hide should probably not be doing it in the first place. Beyond well-documented concerns about the involuntary exposure of non-normative sexuality and the need for anonymity to maintain so-called safe spaces online – racialised, religious, and genderqueer minorities potentially risk much more than the average white person (let alone Eric Schmidt) by revealing everything.

Beauchamp, for example, discusses the implications of “nothing to hide” for trans* people, for whom questions of stealth versus visibility take on multiple dimensions. While some negotiate a desire/need to remain hidden with medical and legal records that were never quite private, others’ compliant visibility risks complicity with a national security discourse around who can “pass” as a “safe” citizen or traveller (white, middle class, conclusively gendered, and definitely not Muslim) and whose deviance from the norm becomes subject to policing. Those in less privileged positions receive the blame for their own exclusion and exploitation – if bad things happen to them as a result, it must be because they have something to hide (Andrejevic 2015:xvii).

4. Surveillance takes many forms

In addition to practices that are habitually considered surveillance, for example CCTV and drone footage, wiretapping or PRISM, a feminist perspective can draw attention to a much wider range of de facto surveillance. For all the necessary and important focus on data-based mass surveillance since the release of the Snowden files, it is important to keep in mind that more mundane practices need to remain part of the discussion – not least because they also leave data traces outside of our control.




It is no coincidence that the first book on feminist surveillance studies considers fertility screenings, ultrasound images, birth certificates, surrogacy blogs, police photos of domestic violence and the like as racialised, gendered, classed and sexualised technologies of surveillance, alongside those traditionally considered surveillance proper. With a nod to bell hooks, the authors call the ways in which surveillance practices play into the hands of privilege “white supremacist capitalist heteropatriarchal surveillance”.

 

Alongside work that firmly places race at the heart of surveillance studies, feminist perspectives enrich and complicate the ways we think about the technologies tracking our every move as well as which technologies we think about in the first place. Last but not least, my hope is that this broadening of the scope might further inspire a wider public to care about how various kinds of surveillance seep into every aspect of our lives.


5. Read number 3 again, it’s important!

 

 

Author: Nicole Shephard

Nicole Shephard is a feminist researcher and writer interested in the gender and tech nexus, surveillance, intersectionality and digital activism. She holds a PhD in Gender (LSE) and an MSc in International Development (University of Bristol). She tweets as @kilolo_.