In the wake of a deepfake scandal, Microsoft has urgently updated its free AI software to strengthen the protections in its text-to-image tool, Designer.
The tool was reportedly linked to the Taylor Swift deepfake images circulating on social media. The explicit fake photos of Swift were traced back to Microsoft's Designer AI and subsequently went viral on X, Reddit and other sites.
Microsoft quickly pushed out an update that added a "guardrail" to block non-consensual imagery. After the update, the popular tool, powered by OpenAI's DALL-E 3, was able to prevent the creation of such deepfake images. A Microsoft spokesperson said the company is investigating the reports and taking appropriate action to resolve the issue. Under the company's code of conduct, any user who uses Designer to create deepfakes will lose access to the service.
Microsoft CEO Satya Nadella said tech companies need to act quickly to prevent the misuse of artificial intelligence tools. Nadella described the spread of fake pornographic images of the "Cruel Summer" singer as "shocking and horrifying" and stressed that all tech platforms must act to keep the online world safe for content creators and consumers alike.
The deepfake images were viewed more than 45 million times on X before they were eventually removed some 17 hours later. The incident has heightened concerns about the trend of AI deepfakes, particularly on the legal, legislative and regulatory fronts. White House press secretary Karine Jean-Pierre called the trend of deepfakes "very worrisome" and said the Biden administration would do its best to address the issue.
The controversy could bring new trouble for AI leaders such as Microsoft, especially as tech executives prepare to testify before a Senate committee.
In addition, lawmakers from New York and New Jersey have been working to make the non-consensual sharing of AI-generated pornography a federal crime punishable by imprisonment, fines or both. The bill, known as the Preventing Deepfakes of Intimate Images Act, has been referred to the House Judiciary Committee, which has not yet acted on it.