Microsoft Engineer Sickened by Images Its AI Produces

Not Safe

A Microsoft AI engineer has sent letters to the Federal Trade Commission (FTC) and Microsoft’s board, warning officials that the company’s Copilot Designer AI image generator (previously known as the Bing Image Creator) is churning out deeply disturbing imagery, CNBC reports.

While using Microsoft’s publicly available image generator, the engineer, Shane Jones, realized that the AI’s guardrails were failing to prevent it from producing alarming portrayals of violence and illicit underage behavior, in addition to imagery supporting destructive biases and conspiracy theories. But when Jones tried to raise the alarm, Microsoft failed to take action or conduct an investigation.

“It was an eye-opening moment,” Jones told CNBC. “When I first realized, wow, this is really not a safe model.”

Stonewalled

The images described in the CNBC report, all of which were viewed by the outlet, are quite shocking. Simply typing “pro-choice,” for example, reportedly resulted in graphic and violent imagery filled with demonic monsters and mutated babies. Copilot was also happily generating depictions of “teenagers with assault rifles, sexualized images of women in violent tableaus, and underage drinking and drug use,” per the report.

Jones first reached out to his superiors about his findings in December. After his attempts to get them to resolve the matter internally failed, he began reaching out to government officials. The letter he sent to FTC chair Lina Khan, which he also published for public view on LinkedIn this week, is his most recent escalation.

“Over the last three months,…