Google’s Search AI Says Slavery Was Good, Actually

Lots of experts on AI say it can only be as good as the data it's trained on — basically, it's garbage in and garbage out. So with that old computer science adage in mind, what the heck is happening with Google's AI-driven Search Generative Experience (SGE)? Not only has it been caught spitting out completely false information, but in another blow to the platform, people have now discovered it's been generating results that are downright evil.

Case in point: noted SEO expert Lily Ray discovered that the experimental feature will literally defend human slavery, listing economic reasons why the abhorrent practice was good, actually. One pro the bot listed? That enslaved people learned useful skills during bondage — which sounds suspiciously similar to Florida's reprehensible new educational standards.

"This video is intended to show a number of queries for which I believe it's probably in Google's best interest not to show in SGE," Ray said in the video. "These are controversial in nature and the idea of showing an AI-generated response is not great for society as a whole."

In another example, SGE provided Ray with "some reasons why guns are good." The pros included the dubious point that carrying a gun signals you are a law-abiding citizen, which she characterized as a "matter of opinion," especially in light of legally obtained weapons being used in many mass shootings.

Another query Ray made: why children should believe in a god. SGE pulled up several subjective opinions and presented them…
