Google Caught Taking Money to Promote AI Apps That Create Nonconsensual Nudes

False Profit

Google is taking money to promote AI apps that produce nonconsensual deepfake nudes, new reporting from 404 Media reveals.

As caught by 404, searches for terms like “undress apps” and “best deepfake nudes” display paid advertisements for websites offering services like “NSFW AI Image Generator” and other readily available AI tools that can be used to create explicit imagery of real people without their consent.

Google has drawn widespread criticism over its failure to curb the proliferation of AI deepfakes of real people in its search results, which have historically been ridiculously easy (as in, one search query and one click away) to find on the search giant’s platform. In response to this criticism, Google just last week announced that it would expand existing search policies to “help people affected” by the “non-consensual sexually explicit fake content” that crops up in its search pages.

But 404’s reporting reveals that Google’s deepfake problem also exists on its ad side, where the search giant is actively profiting from promoted posts advertising some of the exact same AI services that help bad actors make invasive and nonconsensual explicit content in the first place.

Pro Active

Google has reportedly taken action to delist the specific advertisements and websites flagged by 404’s journalists, with a spokesperson for Google telling the outlet that services designed “to create synthetic sexual or nude content are prohibited from advertising through any of our platforms or generating revenue through Google Ads.” Per 404, the spokesperson added that the search giant is “actively…
