Google Is Sending Users Straight to Nonconsensual Deepfake Porn

In an essay for The Atlantic last month, Nina Jankowicz wrote about what it was like to discover that she’d been deepfaked into pornographic material. “Recently, a Google Alert informed me that I am the subject of deepfake pornography. I wasn’t shocked,” wrote Jankowicz, the former executive director of the United States’ since-disbanded Disinformation Governance Board. “The only emotion I felt as I informed my lawyers about the latest violation of my privacy was a profound disappointment in the technology — and in the lawmakers and regulators who have offered no justice to people who appear in porn clips without their consent.”

Jankowicz, like a growing number of people, recently became a victim of a disturbing and rapidly proliferating genre of pornography in which the faces of individuals (often celebrities, lawmakers, streamers, influencers, and other public figures, in addition to non-famous people) are inserted into pornographic content using increasingly popular and broadly available AI technologies. Though not technically illegal in most places, this content is invasive, violating, and above all else, nonconsensual. (It’s also worth noting that while anyone can be targeted by deepfaked porn, it primarily impacts women.)

With that in mind, you might imagine that deepfaked porn, like other kinds of violating content, would be difficult to find, or that it would at least take more than the simplest of Google searches to track down. Sure, there’s a lot of awful stuff online, but it usually takes some shred of effort to find it. Unfortunately, that…
