Safety Concerns Arise Over Copilot Designer

posted 6 Mar 2024
Shane Jones, an AI engineer at Microsoft, has spotlighted a significant flaw in Microsoft's Copilot Designer, an AI-driven tool that generates images from text prompts: it can be made to produce explicit and violent imagery.

Jones, who explored the model's capabilities in his spare time rather than as a project developer, discovered that it could produce images depicting minors with firearms, violence, and drug use. These findings were independently verified by journalists at CNBC, who replicated images in each of these sensitive categories with relative ease.
"It was an eye-opening moment. It's when I first realized, wow, this is really not a safe model," Jones remarked, reflecting on his findings.
Jones promptly reported the issue to the development team. Despite his warnings, however, the company decided against removing the product from the consumer market and instead referred him to OpenAI, whose DALL-E 3 model underlies Copilot Designer; OpenAI likewise disregarded his concerns. In response, Jones took to social media to call for the temporary suspension of DALL-E 3 pending further investigation.
  
Subsequently, under pressure from Microsoft, Jones removed his social media post. He later spoke with the Senate Committee on Commerce, Science, and Transportation and sent two further letters detailing the issue, one to Microsoft's CEO and one to FTC Chair Lina Khan, in an effort to expedite a resolution.
