Dressed Down

AI image generators that claim the ability to “undress” celebrities and random women are nothing new, but now they’ve been spotted in monetized ads on Instagram.

As 404 Media reports, the ad library of Meta, the parent company of Facebook and Instagram, contained several paid posts promoting so-called “nudify” apps, which use AI to make deepfaked nudes out of clothed photos.

In one ad, a photo of Kim Kardashian was shown next to the words “undress any girl for free” and “try it.” In another, two AI-generated photos of a young-looking girl sit side by side: one shows her wearing a long-sleeved shirt, while the other appears to show her topless, with the words “any clothing delete” covering her breasts.

Over the past six months, these sorts of apps have gained unfortunate notoriety after they were used to generate fake nudes of teen girls at schools in the United States and Europe, prompting investigations and proposed legislation aimed at protecting children from such harmful uses of AI.

As Vice reported at the end of last year, students in Washington said they found the “undress” app they used to create fake nudes of their classmates via TikTok advertisements.

Why go overseas for a nudify tool to exploit teen girls in your school when you can get them on Instagram? pic.twitter.com/Yrvw4r8F7t
— SwiftOnSecurity (@SwiftOnSecurity) March 30, 2024

Takedown Request

In its investigation, 404 found that many of the ads its reporters came across had been taken down from the Meta Ad Library by the time…