Recently, Yvonne Meré, Chief Deputy City Attorney for the City of San Francisco, filed a lawsuit against 16 websites that use artificial intelligence to create deepfake pornographic content, manipulating photos of women and girls into fake nude images without their consent. This unprecedented legal action seeks to combat a harmful trend growing among youth, especially teenage boys who use "nudification" apps to manipulate photos of their female classmates.
Source: image generated by AI (Midjourney)
According to the New York Times, the 16 websites named in the lawsuit were visited a total of 200 million times in the first six months of the year. The companies behind the sites are based in California, New Mexico, the United Kingdom, and Estonia. Representatives of the sites either did not respond to reporters' inquiries or could not be reached for comment. Some of the sites advertise their services with lines such as "Want her to strip?", while others directly encourage users to generate nude photos of women through the sites.
Notably, these sites typically offer initial images for free but charge for subsequent processing, with payment accepted in cryptocurrency or by credit card. The deepfake technology they use relies on AI models trained on real pornographic images and child abuse imagery to generate realistic-looking nude photos. San Francisco City Attorney David Chiu emphasized that the consequences for those responsible are almost negligible, noting that once an image has been distributed, it becomes very difficult to trace it back to the originating site, which complicates victims' pursuit of legal redress.
Sara Eisenberg, who heads a legal unit focused on major social issues, points out that educating young people about the safe use of technology is not enough: any photo can be processed without authorization, and traditional safeguards are no longer effective. "Even if kids are skilled in the use of the internet and social media," Eisenberg says, "there's still nothing to stop someone from using these sites to do extremely bad things."
The lawsuit seeks not only to shut down the sites but also to permanently enjoin them from creating deepfake pornographic content, and it asks for civil penalties and attorneys' fees. It alleges that the sites violate state and local revenge porn laws, child pornography laws, and California's unfair competition law, which prohibits unlawful and unfair business practices.
Meré said that after reading about the dangers of deepfake images in the New York Times, she immediately contacted Eisenberg and sought Chiu's support in putting together a lawsuit. Chiu noted that deepfake nude photographs of everyone from Taylor Swift to the average high school student have become commonplace, with little or no punishment for those responsible. Experts warn that deepfake pornography has serious consequences for victims' mental health, reputations, and college and job prospects.
While Chiu acknowledges that this approach may turn into a game of whack-a-mole as new sites spring up, his office hopes to bring those new sites to court as the problem evolves. As the center of the AI industry, San Francisco is the right place for this legal battle; Chiu points out that while the AI industry's contributions have been positive, the rise of deepfake pornography is a "dark side" that must be addressed.