A Study Reveals that More than 24 Million Individuals Visit Websites Enabling AI-Based Undressing of Women in Photos

Technology

Social network analysis company Graphika uncovered a concerning trend: more than 24 million visits to undressing websites in September alone, underscoring a disturbing rise in non-consensual pornography propelled by advances in artificial intelligence. Here are the specifics.

The surge in popularity of applications and websites that use artificial intelligence to digitally undress women in photos has raised concerns among researchers and privacy advocates, as reported by Bloomberg. Graphika, a social network analysis company, disclosed that an astonishing 24 million individuals visited these undressing platforms in September alone, signaling a troubling increase in non-consensual pornography propelled by advances in artificial intelligence.

These services, often referred to as “nudify” apps, have leveraged popular social networks for marketing: links advertising undressing applications have soared by more than 2,400 percent on platforms such as X and Reddit since the start of the year. The use of AI to digitally undress individuals, predominantly women, poses serious legal and ethical challenges, given that the images are frequently sourced from social media without the subject’s consent or awareness.

The alarming trend extends to potential harassment, with some advertisements suggesting customers could create nude images and send them to the digitally undressed person. Google has articulated its policy against sexually explicit content in ads and is actively removing violative material, but X and Reddit have yet to respond to inquiries.

Privacy experts are sounding the alarm about the growing accessibility of deepfake pornography facilitated by advances in AI technology. Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, notes a shift toward ordinary people using these technologies on everyday targets, including high school and college students. Many victims may remain unaware that manipulated images of them exist, and those who do become aware face obstacles in seeking law enforcement intervention or pursuing legal action. Despite mounting concerns, there is currently no federal law in the United States explicitly prohibiting the creation of deepfake pornography. A recent case in North Carolina, in which a child psychiatrist was sentenced to 40 years in prison for using undressing apps on patient photos, marks the first prosecution under a law banning the deepfake generation of child sexual abuse material.

In response to this alarming trend, TikTok and Meta Platforms Inc. have taken measures to block keywords associated with these undressing apps. TikTok warns users that the term “undress” may be linked to content violating its guidelines, while Meta Platforms Inc. declined to provide further comments on its actions. As technology evolves, the ethical and legal challenges posed by deepfake pornography underscore the urgent need for comprehensive regulations to protect individuals from the non-consensual and harmful use of AI-generated content.
