Software Released to Make Your Original Art Poison AI Models That Scrape It

Poison Pill

Being an artist in today's world means you can't just worry about plain ol' plagiarism and duplicitous copycats; you now have to be mindful of generative AI models cribbing your work, too. Thankfully, a new tool called Nightshade purportedly not only protects your images from being mimicked by AI models, but also "poisons" those models by feeding them misleading data.

First teased in late 2023, Nightshade is now finished and available for download, its developers announced on Friday. It's the latest sign of artists hardening their stance against AI image generators like Stable Diffusion and Midjourney, which were trained on their works without permission or compensation.

Artist Counterattack

Nightshade's developers, a team of computer scientists at the University of Chicago, say their software is meant to be "an offensive tool," whereas its predecessor Glaze was designed to be a defensive one. They still recommend using both tools.

Glaze works by subtly modifying, or "glazing," an image at the pixel level. These changes are largely imperceptible to the naked eye, "like UV light" in the developers' words, but are clearly visible to AI models, which see imagery differently. The overall effect obfuscates an image's content to an AI.

Nightshade takes this a step further. In "shading" an image, the tool likewise introduces subtle changes, but these alterations can cause an AI model to incorrectly identify what it's seeing. A human might see a "cow in a green field," the developers wrote, "but…"
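Nightshade's actual optimization is more elaborate and targets the training pipelines of image generators, but the general idea of a small, targeted pixel perturbation can be sketched against an ordinary image classifier. The snippet below is an illustrative sketch only, assuming PyTorch/torchvision, a generic pretrained ResNet, a hypothetical input file, and an arbitrarily chosen wrong label; it is not Nightshade's or Glaze's method.

```python
# Illustrative sketch: nudge an image's pixels a tiny amount so that a
# pretrained classifier leans toward a wrong label, while the change stays
# hard to notice. NOT Nightshade's algorithm; file name, target class, and
# epsilon are assumptions for demonstration.
import torch
from torchvision import models, transforms
from PIL import Image

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
])

image = preprocess(Image.open("cow_in_field.jpg")).unsqueeze(0)  # hypothetical file
image.requires_grad_(True)

wrong_label = torch.tensor([414])  # arbitrary incorrect ImageNet class index
loss = torch.nn.functional.cross_entropy(model(image), wrong_label)
loss.backward()

# One small gradient step toward the wrong label (FGSM-style update),
# clamped so the result remains a valid image.
epsilon = 4 / 255
shaded = (image - epsilon * image.grad.sign()).clamp(0, 1).detach()
```

A real poisoning attack would optimize the perturbation much more carefully and would aim to corrupt what a generative model learns during training, rather than fool a single classifier at inference time.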