Scientists Say New Tool Makes Images Worthless for Training AI

Good news, human artists! Scientists at the University of Chicago claim to have invented a clever new tool, dubbed Glaze, to protect the work of human artists from being used to train AI systems. Designed to “protect human artists by disrupting style mimicry,” as the project’s website reads, the tool works by using machine learning to compute “a set of minimal changes to artworks, such that it appears unchanged to human eyes, but appears to AI models like a dramatically different art style.”

In other words, Glaze adds a second, imperceptible layer to digital photos of artwork, which effectively operates as an invisible disguise. The artwork looks the same to humans, but to an image-generating AI like Midjourney or Stable Diffusion, the image looks completely different. So when the AI is trained on that image, the disguise obscures the machine’s understanding of what the human artist’s work actually looks like, thus rendering the system’s attempts to mimic that style useless.

Fighting Chance

According to CNN, a growing group of artists is already using the tool.

“We’ve never been asked if we’re okay with our pictures being used, ever,” Eveline Fröhlich, a visual artist based in Germany who sells prints and illustrates album and book covers to get by, told CNN of her experience as an artist in a world increasingly awash in generative AI. “It was just like, ‘This is mine now, it’s on the internet, I’m going to get to use it.’ Which is ridiculous.”

But Glaze, Fröhlich told CNN, has renewed a…
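To make the mechanism described above more concrete, here is a rough, illustrative sketch of the general style-cloaking idea: optimize a tiny, bounded perturbation so the image's embedding drifts toward a decoy style while the pixels barely change. This is not the actual Glaze implementation; its models, losses, and perceptual constraints differ. The function name cloak is hypothetical, a torchvision ResNet-18 stands in for whatever feature extractor a generative model might use, and a simple L-infinity bound stands in for a proper perceptual-similarity constraint.

    # Illustrative sketch only -- NOT the Glaze algorithm.
    import torch
    import torch.nn.functional as F
    from torchvision.models import resnet18, ResNet18_Weights

    def cloak(image, decoy_style_image, epsilon=4/255, steps=100, lr=0.01):
        """Compute a small perturbation that leaves `image` visually unchanged
        but moves its embedding toward `decoy_style_image` in a stand-in
        feature extractor's space (assumed ResNet-18 here)."""
        encoder = resnet18(weights=ResNet18_Weights.DEFAULT)
        encoder.fc = torch.nn.Identity()   # use penultimate features as the embedding
        encoder.eval()

        with torch.no_grad():
            decoy_feat = encoder(decoy_style_image)

        delta = torch.zeros_like(image, requires_grad=True)
        opt = torch.optim.Adam([delta], lr=lr)

        for _ in range(steps):
            perturbed = (image + delta).clamp(0, 1)
            feat = encoder(perturbed)
            # Pull the cloaked image's embedding toward the decoy style...
            loss = F.mse_loss(feat, decoy_feat)
            opt.zero_grad()
            loss.backward()
            opt.step()
            # ...while keeping the pixel change too small for humans to notice.
            with torch.no_grad():
                delta.clamp_(-epsilon, epsilon)

        return (image + delta).detach().clamp(0, 1)

    # Usage (both tensors are 1x3xHxW, values in [0, 1]):
    # cloaked = cloak(artwork_tensor, decoy_style_tensor)

The point of the sketch is the trade-off the article describes: the perturbation is capped so humans see essentially the same picture, while a model trained on the cloaked copy learns features of the decoy style instead of the artist's own.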
