Apps and websites that use artificial intelligence to "undress" women in photos are soaring in popularity, researchers found, with 24 million people visiting such sites in September 2023 alone, according to social network analytics firm Graphika.
Graphika's data shows that many of these "undressing" services rely on popular social networks for marketing: the number of links advertising such apps on social media, including on platforms such as X and Reddit, has grown by more than 2,400% since the beginning of the year. The services use artificial intelligence to recreate images so that the person appears nude, and many of them work only on images of women.
Source note: Image generated by AI via Midjourney.
These applications are part of a worrying trend of non-consensual pornography enabled by advances in artificial intelligence: fabricated media known as deepfake pornography. Because the source photos are often taken from social media and the altered images are distributed without the subject's knowledge or consent, their spread raises serious legal and ethical problems.
One ad image posted on X used language suggesting that customers could create a nude image and then send it to the person who had been digitally "undressed," inviting harassment. Another app paid for sponsored content on Google's YouTube and appears first in search results for the word "nudify."
In response, a Google spokesperson said the company does not allow ads that contain gender-based violence, and has reviewed the ads in question and removed those that violate the policy. X and Reddit, meanwhile, did not respond to requests for comment.
Non-consensual deepfake pornography targeting public figures has long been a problem on the internet, but privacy experts are increasingly concerned that advances in AI are making deepfake software easier to use and more effective.
"We're seeing more and more ordinary people as well as ordinary targets falling victim to this type of behavior," said Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, "and this is especially true among high school and college students."
Many victims never discover these images exist, and even those who do may struggle to get law enforcement agencies to investigate or to find the funds to pursue legal action.
There is currently no federal law prohibiting the production of deepfake pornography, although U.S. federal law does prohibit the production of such images of children. Last November, a child psychiatrist in North Carolina was sentenced to 40 years in prison for using "undressing" apps on photos of his patients, the first prosecution of its kind under the law banning the deepfake generation of child sexual abuse material.
In an effort to curb such content, TikTok has blocked the keyword "undress," warning anyone who searches for the term that it "may be associated with behavior or content that violates our guidelines." Meta Platforms Inc. has also begun blocking keywords related to searches for "undressing" apps. A TikTok representative declined to comment on the move, as did a Meta Platforms spokesperson.