A shadowy AI service has transformed thousands of women’s photos into fake nudes: ‘Make fantasy a reality’

More than 100,000 photos of women have had their clothing removed by the software, including images of girls younger than 18.

An artificial intelligence service freely available on the Web has been used to transform more than 100,000 women’s images into nude photos without the women’s knowledge or consent, triggering fears of a new wave of damaging “deepfakes” that could be used for harassment or blackmail.

Users of the automated service can anonymously submit a photo of a clothed woman and receive an altered version with the clothing removed. The AI technology, trained on large databases of actual nude photographs, can generate fakes with seemingly lifelike accuracy, matching skin tone and swapping in breasts and genitalia where clothes once were.

The women’s faces remain clearly visible, and no labels are appended to the images to mark them as fake. Some of the original images show girls younger than 18.

The service, which allows people to place new orders through an automated “chatbot” on the encrypted messaging app Telegram, was first discovered by researchers at Sensity, an Amsterdam-based cybersecurity start-up that shared its findings with The Washington Post.

The chatbot and several other affiliated channels have been used by more than 100,000 members worldwide, the researchers found. In an internal poll, the bot’s users said roughly 63 percent of the people they wanted to undress were girls or women they knew from real life.

Giorgio Patrini, the group’s chief executive, said the chatbot signals a dark shift in how the technology is used, from faking images of celebrities and well-known figures to targeting unsuspecting women far from the public eye.

“The fact is that now every one of us, just by having a social media account and posting photos of ourselves and our lives publicly, we are under threat,” Patrini said in an interview. “Simply having an online persona makes us vulnerable to this kind of attack.”

The chatbot’s growth signals just how quickly the technology behind fake imagery has become ubiquitous.

Ten years ago, creating a similarly convincing fake would have taken advanced photo-editing tools and considerable skill. Even a few years ago, creating a lifelike fake nude using AI technology — such as the “deepfake” porn videos in which female celebrities, journalists and other women have been superimposed into sex scenes — required large amounts of image data and computing resources.

But with the chatbot, creating a nude rendering of someone’s body is as easy as sending an image from your phone. The service also assembles all of those newly generated fake nudes into photo galleries that are updated daily; more than 25,000 accounts have already subscribed for daily updates.

The bot’s biggest user base is in Russia, according to internal surveys, though members also come from the United States and across Europe, Asia and South America.

New users can request some of their first fake nudes for free but are encouraged to pay for further use. A beginners’ rate offers new users 100 fake photos over seven days at a price of 100 Russian rubles, or about $1.29. “Paid premium” members can request fake nude photos be created without a watermark and hidden from the public channel.

The chatbot’s administrator, whom The Post interviewed Monday through messages on Telegram, declined to give their name but defended the tool as a harmless form of sexual voyeurism and said its operators take no responsibility for the women targeted by its user base. In an allusion to its boys-will-be-boys posture, the service’s logos feature a smiling man and a woman being ogled through X-ray glasses.

But technology and legal experts argue that the software is weaponizing women’s own photographs against them, sexualizing women for a faceless group of strangers and presaging a new age of fabricated revenge porn.

Some tech giants have taken a stand against deepfakes and other “manipulated media.” But because the system’s source code has already been widely shared by online copycats, the experts see no clear way to stop similar software from creating, hosting and sharing fake nude images across the unregulated Web.

Some of the targeted women are popular entertainers or social media influencers with sizable audiences. But many of those seen in publicly available photos produced by the bot are everyday workers, college students and other women, their images often taken from selfies or social media accounts on sites like TikTok and Instagram.

Danielle Citron, a Boston University law professor who researches the online erosion of “intimate privacy,” said she has interviewed dozens of women about the experience of having real or manufactured nude images shared online. Many said they felt deep anguish over how their images had been seen and saved by online strangers — and, potentially, their co-workers and classmates.

“You’ve taken my identity and you’ve turned it into porn . . . That feels so visceral, harmful, wrong,” Citron said. “Your body is being taken and undressed without your permission, and there’s documentary evidence of it. . . . Intellectually, [you] know it hasn’t happened. But when [you] see it, it feels as if it has, and you know others won’t always know” it’s fake.

“The vulnerability that creates in how you feel about your safety in the world: Once you rip that from somebody, it’s very hard to take back,” she added.

The bot gives users advice on submitting requests, recommending that the original photos be centered on the women’s breasts and show them in underwear or a swimsuit for best results. But many of the images show women in unrevealing school attire or everyday clothes, like a T-shirt and jeans. At least one woman was pictured in a wedding dress.

One young woman had multiple photos of herself submitted to the service, some of which included a fake bikini top crudely inserted on top of her normal clothes — likely an attempt to improve the bot’s performance.

The automated service, however, only works on women: Submit an image of a man — or an inanimate object — and it will be transformed to include breasts and female genitalia. (In one submitted image of a cat’s face, its eyes were replaced with what appeared to be nipples.)

The bot’s administrator, speaking in Russian, told The Post in a private chat on Monday that they didn’t take responsibility for how requesters used the software, which they argued was freely available anyway. “If a person wants to poison another, he’ll do this without us, and he’ll be the one responsible for his actions,” the administrator wrote.

The Sensity researchers counted more than 104,000 images of women altered to appear nude and shared in public channels. A website for the service suggests the numbers are far higher, with 687,322 “girls nuded” and 83,364 “men enjoyed.” But the administrator said those numbers were random and used only for advertising, because the operators do not keep statistics on processed photos.

The bot’s rules say it does not allow nudes to be made of underage girls. But the service’s publicly visible collections feature teenage girls, including a popular TikTok personality who is 16 years old.

The administrator said the system was designed merely to fulfill users’ fantasies and that everyone who would see the images would realize they were fakes.

“You greatly exaggerate the realness,” the administrator said. “Each photo shows a lot of pixels when zoomed in. All it allows you to do is to make fantasy a reality, visualize and understand that it’s not real.”

The administrator also said the service had not “received a single complaint from a girl during the entire period of our work,” and attempted to shift the blame onto victims of the fakes for posting their images online.

"To work with the neural network, you need a photo in a swim­suit or with a mini­mum amount of clot­hing. A girl who puts a photo in a swim­suit on the Inter­net for everyone to see — for what purpose does (she do) this?” the admi­nis­tra­tor wrote. “90% of these girls post such photos in order to attract atten­tion, focu­sing on sexu­a­lity.”

Following questions from a Post reporter, however, the administrator said they had disabled the bot’s chat and gallery features because of a “lot of complaints about the content.” The service for creating new images remained online, as did previously generated images.

Representatives for Telegram, which offers end-to-end encryption and private chat functions, did not respond Monday to requests for comment.

Britt Paris, an assistant professor at Rutgers University who has researched deepfakes, said manipulators have often characterized their work as experimenting with new technology in a lighthearted way. But that defense, she said, conveniently ignores how misogynistic and devastating the images can be.

“These amateur communities online always talk about it in terms of: ‘We’re just . . . playing around with images of naked chicks for fun,’ ” Paris said. “But that glosses over this whole problem that, for the people who are targeted with this, it can disrupt their lives in a lot of really damaging ways.”

The bot was built on open-source “image-to-image translation” software, known as pix2pix, first unveiled in 2017 by AI researchers at the University of California at Berkeley. Fed a huge number of real images, the system learns to recognize visual patterns and, in turn, create its own fakes, transforming photos of landscapes from daytime to night or from black-and-white into full color.
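At its simplest, paired image-to-image translation can be sketched in a few lines of Python with PyTorch: a small encoder-decoder network learns to map one photo domain to another from matched pairs, such as daytime and nighttime shots of the same landscape. This is a minimal, benign illustration, not code from the service; the `paired_loader` and the layer sizes are assumptions, and the real pix2pix uses a larger U-Net generator plus the adversarial objective sketched after the next paragraph.

```python
import torch
import torch.nn as nn

class TranslationNet(nn.Module):
    """Encoder-decoder that maps a 3-channel image to a 3-channel image."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

net = TranslationNet()
opt = torch.optim.Adam(net.parameters(), lr=2e-4)
l1 = nn.L1Loss()

# `paired_loader` is a hypothetical DataLoader yielding (input, target)
# photo pairs, e.g., the same landscape photographed by day and by night.
for day, night in paired_loader:
    opt.zero_grad()
    loss = l1(net(day), night)  # penalize per-pixel difference from the target
    loss.backward()
    opt.step()
```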

The software relies on an AI breakthrough known as generative adversarial networks, or GANs, that has exploded in popularity in recent years for its ability to process mounds of data and generate lifelike videos, images and passages of text.
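The adversarial setup behind that breakthrough pits two networks against each other: a generator fabricates images while a discriminator learns to tell real photos from fakes, and each one's progress forces the other to improve. The sketch below shows that training loop on a generic, benign image dataset; `real_loader`, the network sizes and the flattened 28x28 grayscale images are illustrative assumptions, not the chatbot's actual code.

```python
import torch
import torch.nn as nn

# Generator maps 64-dimensional noise to a flattened 28x28 image;
# discriminator maps a flattened image to a single real/fake logit.
G = nn.Sequential(nn.Linear(64, 256), nn.ReLU(), nn.Linear(256, 784), nn.Tanh())
D = nn.Sequential(nn.Linear(784, 256), nn.LeakyReLU(0.2), nn.Linear(256, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

# `real_loader` is a hypothetical DataLoader of real images, each
# flattened to 784 values and normalized to [-1, 1].
for real in real_loader:
    batch = real.size(0)
    noise = torch.randn(batch, 64)
    fake = G(noise)

    # Discriminator step: label real images 1 and generated images 0.
    opt_d.zero_grad()
    d_loss = (bce(D(real), torch.ones(batch, 1)) +
              bce(D(fake.detach()), torch.zeros(batch, 1)))
    d_loss.backward()
    opt_d.step()

    # Generator step: reward the generator for making the
    # discriminator classify its fakes as real.
    opt_g.zero_grad()
    g_loss = bce(D(fake), torch.ones(batch, 1))
    g_loss.backward()
    opt_g.step()
```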

The researchers behind pix2pix celebrated its potential benefits for artists and visual creators. But last year, an anonymous programmer trained the underlying software on thousands of photos of naked women, effectively teaching the system to transform women from clothed to nude.

After the tech blog Motherboard wrote last year about the app, called DeepNude, the developer responded to the online backlash by taking the free-to-download app offline, saying, “The probability that people will misuse it is too high.”

The deep-learning pioneer Andrew Ng last year called DeepNude “one of the most disgusting applications of AI,” adding: “To the AI Community: You have superpowers, and what you build matters. Please use your powers on worthy projects that move the world forward.”

But the existence of the chatbot shows it will be virtually impossible to eradicate the software outright. The original app’s source code has been saved and widely distributed online, including on for-profit websites that offer to generate images in exchange for a small fee.

Hany Farid, a computer scientist at UC-Berkeley who specializes in digital-image forensics and was not involved in the original pix2pix research, said the fake-nude system also highlights how the male homogeneity of AI research has often left women to deal with its darker side.

AI researchers, he said, have long embraced a naive techno-utopian worldview that is hard to justify anymore, openly publishing unregulated tools without considering how they could be misused in the real world.

“It’s just another way people have found to weaponize technology against women. Once this stuff gets online, that’s it. Every potential boyfriend or girlfriend, your employer, your family, may end up seeing it,” Farid said. “It’s awful, and women are getting the brunt of it.

“Would a lab not dominated by men have been so cavalier and so careless about the risks?” he added. “Would [AI researchers] be so cavalier if that bad [stuff] was happening to them, as opposed to some woman down the street?”

That problem is already a reality for many women around the world. One woman targeted by the bot, an art student in Russia who asked to remain anonymous because she did not want to get involved with these “stupid people,” had a photo of herself in a tank top taken from her Instagram account and transformed into a fake nude.

In an interview, she compared the fake to someone smearing her name but said she was grateful that enough people knew her to realize it probably wasn’t real.

“The scammers who do this kind of filth will not succeed,” she said. “I believe in karma, and what comes around for them won’t be any cleaner than their own actions.”

Isabelle Khurshudyan and Will Englund contributed to this report.

Drew Harwell

Drew Harwell is a technology reporter for The Washington Post covering artificial intelligence and the algorithms changing our lives. He joined The Post in 2014 and has covered national business and the Trump companies.