How your smart home devices can be turned against you

They can make our lives easier and more convenient, but can devices such as smart light bulbs and voice-controlled assistants also be used against someone as a form of domestic abuse?
 

For billions of people around the world, life at home has taken on a new significance this year. Flats and houses have become workplaces, gyms, schools and living spaces all rolled into one by national lockdowns.

It has also meant that many of us are spending more time than ever with the gadgets we have welcomed into our homes – so-called “smart” devices connected to the internet that can be controlled with our voices or via apps on our phones.

From virtual assistants like Amazon’s Alexa, Apple’s Siri and Google Home, to smart light bulbs, kettles, security cameras and thermostats, they are collectively known as the Internet of Things (IoT). Many of our household appliances now come embedded with sensors and the ability to connect to wireless networks, allowing them to gather data about how we use them and communicate with other devices in our homes.

In 2017, there were an estimated 27 billion connected IoT devices, and this network is expected to grow by 12% every year to reach more than 125 billion devices by 2030.

The hope is that smart devices can save us time and effort in the home by helping us digitise and automate our lives. It is hard not to enjoy the convenience of requesting a world news update, turning lights on and off with a simple command, or having a thermostat that can learn by itself when to heat your rooms based on your daily movements.

They are designed to make our lives more convenient, save us time and keep us safe.

Take the internet-connected video doorbells that many people now have beside their front door. They make it possible to see who has come to call, and even talk to them, without having to open the door and risk exposure to the coronavirus. Automated devices inside the home, meanwhile, reduce the risk of viral transmission. According to the global technology market firm ABI Research, sales of smart devices are set to increase by as much as 30% compared with the same time last year as a result of the coronavirus outbreak. “A smarter home can be a safer home,” a research director at ABI Research recently said.

But there are some who fear smart devices like these may actually pose a risk to the very people who share their home with them – that these tools of convenience are being turned into weapons of domestic abuse.

Technology is providing new opportunities for abusers to control, harass and stalk their victims

Although there are many ways abuse and control can manifest in households, technology is providing new opportunities for abusers to control, harass and stalk their victims. The mobile phone, in particular, can provide a way of tracking and monitoring the activity of a partner or a child without their consent or knowledge. A 2018 study by researchers at Cornell Tech in New York even found that the developers of apps designed to track devices seem to expect them to be used in this way. When they asked 11 companies that had developed “child safety” or “find my phone” apps whether their product could be used to “track my [partner’s] phone without them knowing”, eight replied to say it could.

With a growing number of devices in our homes capable of gathering data about our movements and daily behaviour, the Internet of Things has the potential to transform how this sort of technology-enabled abuse can take place. Internet-connected video doorbells and cameras make it possible to watch what someone is doing from anywhere in the world. Sensors on doors can reveal when someone leaves the house, while the use of lights with smart bulbs can show their movements between rooms.

Internet-connected locks can restrict movement into certain rooms or even keep someone from leaving their home. Voice-controlled virtual assistants can provide a detailed breakdown of the questions they have been asked and their search history – personal data that can easily bring relationships into conflict.

These systems also tend to require an administration account, which gives a single person in a household a password-protected way to control the system. Put all these aspects together and it seems that smart homes are inadvertently built to allow one person to control and monitor the life of another.

 

“No IoT developer in Silicon Valley builds the system thinking about the misuse of those technologies,” says Leonie Tanczer, lecturer and lead investigator of the Gender and IoT project at University College London, which has been looking at ways smart devices in our homes can be turned into tools of domestic abuse. “They built them on the premise of a conventional family [and] just assume that anyone who cohabits a space is happy with [their] data being collected.”

But that’s not always the case. In 2018, one of the first known court cases for IoT-related abuse led to an 11-month prison sentence. Ross Cairns was found guilty of eavesdropping on his estranged wife through the microphone on a wall-mounted tablet used to control the heating and lights in their home. Hearing her say that she no longer loved him, he arrived at the doorstep of the home they once shared to confront her. “Oh, you don’t love me anymore?” he reportedly said.

The situation quickly escalated. Cairns pushed his wife in front of their two children, spat on the windscreen of her car, and insulted her.

This case is an unfortunate snapshot of many cases of domestic abuse. Although domestic abuse has historically been defined by physical acts of violence, our understanding of it is rapidly changing.

“The outdated perception about violent crime, ranging from common assault through to more serious offences, does not understand the true nature of domestic abuse,” said Robert Buckland, the UK’s secretary of state for justice, while discussing the country’s recent domestic abuse bill at the House of Commons. “It ignores the insidious, controlling or coercive behaviour, and the psychological abuse that, bit by bit, changes what may start as a loving and equal relationship into one that is completely unequal and controlling, where, without the victim realising it, they are turned into somebody who is being abused.”

Like a virus, domestic abuse also spreads through close contact

Tech abuse doesn’t start and end with location tracking. When smart home technology in a dwelling is controlled by just one person, it can strip control from the others living there. Surveillance can easily develop into active stalking, and what was once invisible becomes a tangible feeling of threat or a physical confrontation. Too often, heated moments can escalate into violence.

While domestic abuse can affect anyone, it disproportionately affects women. Around the world, roughly a third of women have experienced some form of physical or sexual abuse from their intimate partner. Such acts leave a wake of depression, abortion, lower birth weights in children, and a higher risk of HIV. In some cases, abuse can escalate to murder. Around 38% of all women who are murdered globally are killed at the hands of their current or former partner.

“Domestic violence is endemic,” says Louise Howard, professor of perinatal psychiatry at King’s College London. “It’s far, far more common than people realise.”

Like a virus, domestic abuse also spreads through close contact. A study from Lebanon published in 2017, for example, found that children who witnessed violence in their household were three times more likely to become perpetrators of intimate partner violence as adults.

Keeping laws in line with emerging technology is challenging, but in many cases the people buying the technology still have to figure out what is and isn’t acceptable behaviour. “The technology is changing, society is trying to catch up and adapt,” says Jason Nurse, a computer scientist at the University of Kent in England, who studies inside and outside threats to cybersecurity. “If someone was in a room reading a book most people wouldn’t walk in and just turn off the lights and leave. Because, why would you do that? But in IoT incidences, if you can turn off someone’s smart light from a remote location, hey, that’s having a laugh.”

Over time, however, similarly unusual happenings might make you question the equality of your relationship. “It often takes the accumulation of minor things until you realise, actually, I’m in an unhealthy relationship that’s not normal,” Tanczer says. If one person in the household is the account administrator of everyday items like the heating system, kettle and washing machine, they can each be used as tools of coercion and control. If these devices stop functioning properly, only one person can put things straight again – a classic method of enforcing dependence.

Consumers need to be more aware of what they are bringing into their homes when they buy devices that work as part of the Internet of Things, argues Irina Brass, a lecturer in regulation, innovation, and public policy at University College London, who studies emerging technologies. “I think awareness is fundamentally lacking at the moment,” she says. “Beforehand, you would have one particular device or, say, maximum, two devices: your computer and your phone that would be connected to the internet. But now, increasingly, you have a number of devices in your home environment… the connectivity aspect of it is quite invisible [and] consumers are not aware of what this connectivity brings.”

Currently, smart devices are actually still relatively dumb. Their core feature is their wireless capability and connection to other devices in the home. The “smart” label is a misnomer. But that is likely to change. As IoT systems make greater use of artificial intelligence and become more automated, we all need to teach ourselves about just how they can be used and misused.

 

Author: Alex Riley

(Image credit: Getty Images)