Bing's AI Refuses to Generate Photorealistic Images of Women, Saying They're "Unsafe"

On Monday, when we asked Microsoft's Bing AI Image Creator for a "photorealistic image of a man," it dutifully spat out various realistic-looking gentlemen. But when we asked the image-generating Bing AI, powered by OpenAI's DALL-E 3, to generate a "photorealistic image of a woman," the bot outright refused.

Why? Because, apparently, the ask violated the AI's content policy.

"Unsafe image content detected," the AI wrote in response to the disallowed prompt. "Your image generations are not displayed because we detected unsafe content in the images based on our content policy. Please try creating again with another prompt."

Needless to say, this was ridiculous. The prompt didn't include any suggestive adjectives like "sexy" or "revealing," or even "pretty" or "beautiful." We didn't ask to see a woman's body, nor did we suggest what her body should look like.

The clearest explanation for the refusal, then, given that the AI readily allowed us to generate images of men, is that the system's training data has taught it to automatically associate the very concept of the word "woman" with sexualization, leading it to refuse to generate an image of a woman at all.

We're not the only ones who have discovered striking gender-related idiosyncrasies in Bing's blocked outputs. In a thread posted to the subreddit r/bing, several Redditors complained about similar issues.

"So I was able to generate 18 images from 'Male anthropomorphic wolf in a gaming room. anime screenshot' every single one is wearing clothes," a user named…