They can hear you: 6 ways tech is listening to you

Voice recognition technology often violates human rights, and it’s popping up more and more. Recently we’ve called out Spotify for developing voice recognition tech that claims to be able to detect gender and emotional state, among other things.

But it’s not just Spotify. Some of the most powerful companies in the world are deploying similar abusive tech because harvesting data about you is profitable. The market for voice recognition is growing, expected to be worth a whopping $26.8 billion by 2025.

This is not an exhaustive list, but below are some dangerous examples:

DANGEROUS EXAMPLES OF HOW TECH COMPANIES CAN LISTEN TO YOU

McDonald’s: recognizing your voice at drive-thrus

McDonald’s has been testing voice recognition tech in 10 drive-thru locations in and around Chicago. The company says it uses the tech to identify your “age, gender, accent, nationality, and national origin.” In June, a customer at an Illinois McDonald’s filed a lawsuit alleging that the company violated state law by using voice recognition to take his order without consent.

Amazon Halo Wristband: monitoring the tone of your voice to infer how you feel

Through its Halo Wristband products, Amazon claims it can detect “positivity” and “energy” in your voice to help you improve your communication skills. Creepy much? Tone policing is here.

TikTok: collecting your “voiceprints”

In June, the popular social media app TikTok updated its privacy policy to let users know it may collect “voiceprints” to help the company with its “demographic classification,” which could include everything from race to income to gender to marital status.

Call centers: detecting whether you’re agitated

Some call centers are using AI to try to detect your emotions, and may connect you to a representative best equipped to sell you a certain product based on your detected “angry” state.

Samsung: your smart fridge is recording you

In some of its smart fridge models, Samsung likely uses recordings of your voice (which it may store on its servers) to sell you more products.

HireVue: when your voice decides if you get a job

Following an algorithmic audit and intense public pressure from civil society, AI hiring company HireVue finally dropped facial recognition from its software. But while this means the company won’t use facial recognition to decide if you get hired, nothing is stopping it from using its voice recognition software to make equally problematic inferences about your suitability based on how you talk.

All of this is alarming, unwarranted, and, in certain jurisdictions, illegal. Using voice recognition tech to make inferences about us invades our private lives and reinforces harmful, regressive stereotypes.

We’re keeping an eye on this emerging tech, and calling out companies to hold them accountable. You deserve respect, not exploitation.

And we’re not the only ones paying attention. With our partners from around the world, we launched a campaign to ban biometric surveillance and a call to outlaw automated recognition of gender and sexual orientation.

Spread the word. RT us about the proliferation of this dangerous tech here.

We shouldn’t have to worry about our smart refrigerators, voice assistants, and apps with a microphone listening to us, profiling us, and trying to read our minds.