Google’s search algorithm sees deepfake porn as helpful content

Let’s not beat around the bush: we all love the internet. It’s a treasure trove of information, a place where we meet friends, and a reliable source of silly memes. But every now and then it feels like we’ve opened a can of worms, and what pops out isn’t exactly a laugh riot. Case in point: a recent report by Ars Technica has revealed a gross misstep by Google and Bing. It turns out both have been putting nonconsensual deepfake porn at the top of their search results. And yes, it’s as bad as it sounds.

The underlying NBC News investigation searched the names of 36 popular female celebrities combined with terms like “deepfake,” “deepfake porn,” and “fake nudes.” For 34 of those names on Google and 35 on Bing, the searches led to explicit deepfake content. And we’re not talking about buried links or images; these results were front and center, hogging the prime real estate of the first page of search results.

Image: TrustedReviews

What is deepfake porn?

For the uninitiated, deepfake porn involves pasting one person’s face onto another person’s body in a pornographic video or image — all without the person’s consent. The internet, apparently, has a black market for this kind of material, with popular deepfake websites and forums acting as its seedy underbelly. Not to be outdone, Windows Central added more fuel to the fire: not only were deepfakes of popular celebrities being peddled, but also “fake nude photos of former teen…
