What is undressing AI? What is AI "one-click undressing"?

What is "undressing AI"?

"Undress AI" describes a tool that uses artificial intelligence to remove a person's clothing from an image.

While each app or website works a little differently, they all offer a similar service. Although the manipulated image doesn't actually show the victim's real body, it can imply that it does.

Perpetrators using undressing AI tools may keep these images for themselves or share them more widely. They may use them for sexual coercion (sextortion), bullying and abuse, or as a form of revenge porn.

When someone uses this technology to "undress" children and adolescents, the harm is compounded. A report by the Internet Watch Foundation (IWF) found more than 11,000 potentially criminal AI-generated images of children on a dark web forum dedicated to child sexual abuse material (CSAM); it assessed approximately 3,000 of those images as criminal.

The IWF said it also found "many examples of AI-generated images featuring known victims and famous children." Generative AI can only create convincing images if it learns from accurate source material; in effect, AI tools that generate CSAM must be trained on real images of child abuse.

Risks to watch out for

Undressing AI tools use suggestive language to draw users in, and children may follow their curiosity in response to that language.

Children and teenagers may not yet know the law, so they may struggle to distinguish harmful tools from those that offer harmless fun.

Inappropriate Content and Conduct

The curiosity and novelty of undressing AI tools could expose children to inappropriate content. Because the tools don't show "real" nude images, children may think it's OK to use them. If they then share the resulting image with a friend "for a laugh," they could be breaking the law without even knowing it.

Without intervention from a parent or caregiver, they may continue with the behavior, even if it hurts others.

Privacy and security risks

Many legitimate generative AI tools require a fee or subscription to create images. So if a deepnude site is free, it is likely to produce low-quality images or to have lax security. If a child uploads a clothed picture of themselves or a friend, the site or app could misuse it, including any deepnudes it creates from it.

Children using these tools are unlikely to read the terms of service or privacy policy, so they risk agreeing to things they don't understand.

Production of Child Sexual Abuse Material (CSAM)

The IWF also reports that "self-generated" CSAM circulating online increased by 417% from 2019 to 2022. Note that the term "self-generated" is imperfect, because in most cases abusers coerce children into creating these images.

However, with undressing AI, children could unknowingly be creating AI-generated CSAM. If they upload a clothed photo of themselves or another child, someone could "nudify" that image and share it more widely.

Cyberbullying, abuse and harassment

Just like other types of deepfakes, people can use undressing AI tools, or "deepnudes," to bully others. This could include claiming that a peer sent a nude image of themselves when they didn't, or using AI to create a nude with a victim's features and then mocking them.

It's important to remember that sharing nude images of peers is both illegal and abusive.

How common is "deep nude" technology?

Research shows that the use of such AI tools is increasing, especially when it comes to removing the clothes of female victims.

One undressing AI website said its technology "does not work with male subjects." This is because the tool was trained on images of women, which is true of most such AI tools. Of the AI-generated CSAM investigated by the Internet Watch Foundation, 99.6% featured female children.

Graphika's research found that referral link spam for undressing AI services increased by more than 2,000% since the beginning of 2023. The report also found that 34 of these providers received more than 24 million unique visitors to their sites in a single month. It predicts "further instances of online harm," including sextortion and CSAM.

Offenders may continue to target girls and women more than boys and men, especially if the tools learn primarily from images of women.

What does UK law say?

Until recently, those who created explicit deepfake images were not breaking the law unless the images were of children.

However, the Ministry of Justice recently announced new laws that will change this. Under the new laws, anyone who creates a sexually explicit deepfake image of an adult without their consent will face prosecution, and those convicted will also face an "unlimited fine."

This contrasts with a statement published in early 2024 that the creation of deepfake intimate images was "not sufficiently harmful or culpable to constitute a criminal offence".

Until recently, perpetrators could create and share these (adult) images without breaking the law. However, the Online Safety Act 2023 makes it an offence to share intimate images generated by artificial intelligence without consent.

Generally speaking, the law should cover any image of a sexual nature. This includes works that feature nudity or partial nudity as their subject matter.

One caveat is that this law hinges on intent to cause harm: the person creating the sexually explicit deepfakes must be doing so to humiliate or otherwise harm the victim. Because intent is difficult to prove, it may be hard in practice to prosecute those who create such deepfakes.
