Apps and websites that use artificial intelligence to undress women in photos are rapidly gaining popularity, researchers say.
According to a study by social network analysis firm Graphika, 24 million people visited undressing websites in September alone.
The rise in popularity corresponds to the release of several open-source diffusion models, artificial intelligence that can create images far more realistic than those produced just a few years ago, Graphika said. Because they are open source, the models that app developers use are available for free.
“You can create something that actually looks realistic,” said Santiago Lakatos, an analyst at Graphika, noting that earlier deepfakes were often blurry.
One image posted on X promoting an undressing app used language suggesting that customers could create nude images and then send them to the person whose photo had been digitally undressed, inciting harassment. Meanwhile, one of the apps has paid for sponsored content on Google’s YouTube and appears first when searching for the word “nudify.”
A Google spokesperson said the company does not allow ads that “contain sexually explicit content.”
The company said it was “investigating the ads in question and removing ads that violate our policies.”
A Reddit spokesperson said the site prohibits the non-consensual sharing of faked sexually explicit material and had banned several domains following an investigation. X did not respond to requests for comment.
In addition to the rise in traffic, the services, some of which charge $9.99 a month, claim on their websites that they are attracting many customers. “They are doing a lot of business,” Lakatos said. Describing one of the undressing apps, he said, “If you take them at their word, their website advertises that it has more than a thousand users per day.”
Non-consensual pornography of public figures has long plagued the internet, but privacy experts are growing concerned that advances in AI technology have made deepfake software easier to use and more effective.
“More and more, we’re seeing ordinary people doing this with ordinary targets,” said Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation. “We’re seeing it among high school and college students.”
Galperin said many victims never find out the images exist, and even those who do may struggle to get law enforcement to investigate or to find the funds to pursue legal action.
There is currently no federal law banning the creation of deepfake pornography, though the U.S. government does outlaw generating these kinds of images of minors. In November, a North Carolina child psychiatrist was sentenced to 40 years in prison for using undressing apps on photos of his patients, the first prosecution of its kind under a law banning the deepfake generation of child sexual abuse material.