A Microsoft engineer, Shane Jones, has raised concerns that Microsoft's AI text-to-image generator, Copilot Designer, produces violent and pornographic images.
The engineer, who has worked at Microsoft for six years, tested Copilot Designer in his free time and discovered it could generate such content. He warned the company, but said Microsoft failed to take appropriate action.
After repeatedly raising these issues within the company, Jones ultimately chose to write letters to the Federal Trade Commission (FTC) and Microsoft's board of directors.
A Microsoft spokesperson said the company resolves issues raised by employees in accordance with company policy and has established internal reporting channels to properly investigate and remediate any concerns.