After engineer’s complaints, Microsoft blocks terms that made violent, sexual images on Copilot Designer

Microsoft is working to fix its AI image-generation tool, Copilot Designer, after company engineer Shane Jones revealed that it could be used to generate disturbing content. Jones, who was tasked with testing the tool's safety, found that it could produce violent scenes involving teenagers, sexualized images, and biased content on sensitive topics.