Generative AI models generate AI hype

Text-to-image generators such as DALL-E 2 have been pitched as making stock photos obsolete. Specifically, generating cover images for news articles is one of the use cases envisioned for these models. One downside of this application is that, like most AI models, these tools perpetuate biases and stereotypes in their training data. But do these models also perpetuate stereotypes about AI itself, rather than about people? After all, stock images are notorious for misleading imagery, such as humanoid robots, that propagates AI hype.

We tested Stable Diffusion, a recently released text-to-image tool, and found that over 90% of the images generated for AI-related prompts propagate AI hype. Journalists, marketers, artists, and others who use these tools should use caution to avoid a feedback loop of hype around AI.

An image of “artificial intelligence” generated using Stable Diffusion.

Background: stock photos of AI are misleading and perpetuate hype

Stock photos of all kinds, not just of AI, are some of the most ridiculed images on the internet. News articles on AI tend to use stock photos that include humanoid robots, blue backgrounds with floating letters and numbers, and robotic arms shaking hands with humans. This imagery obscures the fact that AI is a tool and doesn’t have agency. Most of the articles in question are about finding patterns in data rather than robots.

CNN used an image of a robotic arm in an article about using AI to analyze financial data.

A news article’s cover…
