The professor trying to protect our private thoughts from technology

Prof Nita Farahany argues in her new book, The Battle for Your Brain, that intrusions into the mind are so close that lawmakers should enact protections.

Private thoughts may not be private for much longer, heralding a nightmarish world where political views, thoughts, stray obsessions and feelings could be interrogated and punished, all thanks to advances in neurotechnology.

Or at least that is what one of the world’s leading legal ethicists of neuroscience believes.

In a new book, The Battle for Your Brain, Duke University bioscience professor Nita Farahany argues that such intrusions into the human mind by technology are so close that a public discussion is long overdue and that lawmakers should immediately establish brain protections as they would for any other area of personal liberty.

Advances in hacking and tracking thoughts, with Orwellian fears of mind control running just below the surface, are the subject of Farahany’s scholarship, alongside urgent calls for legislative guarantees of thought privacy, including freedom from “cognitive fingerprinting”, that fall within an area of ethics broadly termed “cognitive liberty”.

Certainly the field is advancing rapidly. The recent launch of ChatGPT and other AI innovations showed that some aspects of the simulation of thought, termed machine learning, are already here. It has also been widely noted that Elon Musk’s Neuralink and Mark Zuckerberg’s Meta are working on brain interfaces that can read thoughts directly. A new class of cognitive-enhancing drugs – called nootropics – is being developed. Technologies that allow people experiencing paralysis to control an artificial limb or write text on a screen just by thinking are in the works.

But aside from the many benefits, there are clear threats around political indoctrination and interference, workplace or police surveillance, brain fingerprinting, the right to have thoughts, good or bad, the implications for the role of “intent” in the justice system, and so on.

Farahany, who served on Barack Obama’s commission for the study of bioethical issues, believes that advances in neurotechnology mean that intrusions through the door of brain privacy, whether by way of military programs or by way of well-funded research labs at big tech companies, are at hand via brain-to-computer innovations like wearable tech.

“All of the major tech companies have massive investments in multifunctional devices that have brain sensors in them,” Farahany said. “Neural sensors will become part of our everyday technology and a part of how we interact with that technology.”

Such devices, coupled with advances in science aimed at decoding and rewriting brain functions, are widespread and pose a discernible risk, Farahany argues, and one that requires urgent action to bring under agreed controls.

“We have a moment to get this right before that happens, both by becoming aware of what’s happening and by making critical choices we need to make now to decide how we use the technology in ways that are good and not misused or oppressive.”

The brain, Farahany warns, is the one space we still have for reprieve and privacy, where people can cultivate a true sense of self and keep how they’re feeling and their reactions to themselves. “In the very near future that won’t be possible,” she said.

In a sense, we already use technology to translate our thoughts and help our minds. Social media’s ability to read minds is already offered, free of charge, through participation with like and dislike functions, predictive algorithms, predictive text and so on.

But advances in neurotechnologies – exploiting a direct connection to the brain – would offer more precise and therefore potentially dangerous forays into a hitherto private realm.

“I wrote this book with neurotechnology at the forefront as a wake-up call, but not just neurotechnology but all the ways our brains can be hacked and tracked and already are being hacked and tracked,” Farahany said.

Concerns about military-focused neuroscience, called the sixth dimension of warfare, are not in themselves new.

The Defense Advanced Research Projects Agency (Darpa) has been funding brain research since the 1970s. In 2001, the agency launched a program to “develop technologies to augment warfighters”.

François du Cluzel, a project manager at Nato Act Innovation Hub, issued a report in November 2020 entitled Cognitive Warfare, which said such warfare “is not limited to the military or institutional world. Since the early 1990s, this capability has tended to be applied to the political, economic, cultural and societal fields.”

The US government has blacklisted Chinese institutes and firms it believes to be working on dangerous “biotechnology processes to support Chinese military end uses”, including “purported brain-control weaponry”.

In late 2021, the commerce department added 34 China-based entities to a blacklist, citing some for involvement in the creation of biotechnology that includes “purported brain-control weaponry” and for “acting contrary to the foreign policy or national security interests” of the US.

Nathan Beauchamp-Mustafaga, a policy analyst at the Rand Corporation and author of the China Brief, has warned of an “evolution in warfare, moving from the natural and material domains – land, maritime, air and electromagnetic – into the realm of the human mind”.

Farahany argues that societies need to go further than addressing cognitive warfare or banning TikTok. Legislation to establish brain rights or cognitive liberties is needed, alongside raising awareness of the risks of intrusion posed by digital platforms integrated with advances in neuroscience.

“Neuro rights” laws, which include protections on the use of biometric data in health and legal settings, are already being drawn up. Two years ago, Chile became the first nation to add articles into its constitution to explicitly address the challenges of emerging neurotechnologies. The US state of Wisconsin has also passed laws on the collection of biometric data regarding the brain.

Most legal protections are around the disclosure of the collection of brain data, not around neuro rights themselves.

“There’s no comprehensive right to cognitive liberty, as I define it, that applies to far more than neurotechnologies but applies to self-determination over our brains and mental experiences, which applies to so many of the digital technologies we’re approaching today,” Farahany said.

Or, as Farahany writes in her book: “Will George Orwell’s dystopian vision of thoughtcrime become a modern-day reality?”

The answer could be yes, no or maybe, but none of it precludes an urgent need for formal brain protections that legislators or commercial interests may not be inclined to establish, Farahany believes.

She said: “Cognitive liberty is part of a much broader conversation that I believe is incredibly urgent given everything that is already happening, and the increasing precision with which it’s going to happen, within neurotechnology.”

 

Photo: Prof Nita Farahany wears a mental stimulator at Duke University in Durham, North Carolina, in February. Photograph: Justin Cook/The Observer