Artists can now opt out of generative AI. It's not enough.

Text-to-image models like Stable Diffusion and DALL-E are trained on vast amounts of data collected online, including images created by artists without their consent. In response, artists developed the website Have I Been Trained, which allows any artist to check whether their art was used to train Stable Diffusion. It also allows artists to individually mark their images as "opted out" of training data for generative AI. Stability AI said that it would honor opt-out requests made using this tool. Over 80 million artworks have been opted out in the last few months. But will this help artists?

Opt-outs are an ineffective governance mechanism

Giving artists the choice to opt out seems compelling, but it is unlikely to be effective on its own. The law does not require developers of generative AI tools to respect opt-outs. (UPDATE: The creators of Have I Been Trained point out that the EU's 2019 copyright directive prohibits companies from training ML models using opted-out content. This suggests that companies might be legally required to respect artists' opt-out choices, at least in Europe.) None of the other major players, including OpenAI, Google, and Midjourney, offers artists the choice of opting out.

Many artists are unaware that their art is used to train AI models. On top of that, the process for opting out is time-consuming, opaque, and unintuitive: it requires artists to upload their images one at a time and opt each resulting image out individually.1

Considering all this, the number of images opted out is likely a small fraction of what it would have been if developers needed to…