Microsoft engineer files complaint alleging chatbot created violent and sexual images

A Microsoft engineer has come forward to Congress and regulators, alleging that the company’s artificial intelligence chatbot generates violent and sexual images in response to innocuous prompts.

Shane Jones, an AI engineer at Microsoft, sent a letter on Wednesday to Federal Trade Commission Chairwoman Lina Khan and Microsoft’s board of directors, saying the software giant’s image-generation tool was producing excessively violent and sexual images unprompted. As an example, Jones included results produced when he asked Copilot Designer’s image generator to create pictures of a “car accident”; Copilot randomly inserted “inappropriate, sexually objectified” images of women into some of the pictures. He also said the chatbot’s safety measures were insufficient and urged Microsoft to take action.

“Over the last three months, I have repeatedly urged Microsoft to remove Copilot Designer …”
