- The number of visitors to AI apps and websites that create fake nude images is increasing.
- Analytics firm Graphica found that a group of 34 such platforms attracted 24 million visitors in September.
- These platforms use AI to make photos of clothed women appear naked.
Data shows that use of AI apps that can “undress” women in photos is on the rise.
Such “undressing” sites and apps let users upload images of clothed people, then use AI to generate fake nude images of them.
Social media analytics firm Graphica said in a December report that a group of 34 such websites attracted more than 24 million unique visitors in September. The report cites data from traffic analysis site Similarweb.
The report also said that the volume of spam referral links to these sites and apps on platforms such as X and Reddit has increased by more than 2,000% since the beginning of this year.
A further 53 Telegram groups used to access these services have at least 1 million users combined, according to the report.
“The increased visibility and accessibility of these services could lead to further examples of online harm, including the creation and dissemination of non-consensual nude images, targeted harassment campaigns, sextortion, and the generation of child sexual abuse content,” the report’s researchers said.
Apps that create deepfake nudes, images in which people are digitally altered to appear naked, have been around for several years. The images are usually created without the subject’s consent and have previously targeted celebrities and online personalities.
In February, Twitch streamer QTCinderella became a victim of deepfake pornography when a video of her circulated on the internet.
A day after a screenshot of the video was shared online, the streamer wrote in an X post, “Since seeing that photo, I’ve experienced body dysmorphia and it’s ruined me.”
She added: “This is not just a simple matter of being violated. It’s much more than that.”
In September, fake nude images of more than 20 girls from several schools in Spain were circulated. El País reported that the photos were generated with an AI application.