AI Platform Has a “Bounties” System for Creating Deepfakes of Regular People

Extremely Disturbing

Civitai, one of the internet’s largest AI platforms, is incentivizing users to make deepfakes of real people, a disturbing report from 404 Media has revealed.

Per 404, the AI model marketplace, which has reportedly received millions in funding from the venture capital firm Andreessen Horowitz (a16z), recently introduced a “bounties” feature: a monetized system in which users compete to build LoRA image models in exchange for digital currency.

Basically, a user posts the “bounty,” a call for an AI model that can generate hyperspecific images. Other Civitai users attempt to build the requested model, and the original bounty poster chooses a winner. That winner gets paid in something called “Buzz,” virtual cash that can be purchased from Civitai with real money.

Civitai is already known as a platform where nonconsensual pornographic deepfakes are easily created and disseminated. It sadly comes as no surprise, then, that the bulk of these bountied creations appear to be nonconsensual pornographic imagery, almost entirely of women. According to 404’s reporting, this includes unwanted deepfakes of public figures like celebrities and influencers, as well as at least one person with no semblance of a public presence; that bounty, per 404, provided just a handful of photos taken from the person’s social media accounts.

It’s a troubling sign of the times, given the availability of the tech, the brazenness of its misuse, and the readiness of major financial backers to fund a platform like this regardless.