Character.AI Says It’s Made Huge Changes to Protect Underage Users, But It’s Emailing Them to Recommend Conversations With AI Versions of School Shooters

In December, we published an investigation into a particularly grim phenomenon on Character.AI, an AI startup with $2.7 billion in backing from Google: a huge number of minor-accessible bots on its platform based on the real-life perpetrators and victims of school shootings, designed to roleplay scenes of horrific carnage. The company didn't respond to our request for comment at the time, but it dismissed our reporting in a statement to Forbes, claiming that the "Characters have been removed from the platform." It's true that the company pulled down the exact school shooter chatbots we specifically flagged, but it left many more online. In fact, it continued recommending the ghoulish bots to underage users.

Last month, an email we'd used to register a Character.AI account as a 14-year-old received an alarming new message. "Vladislav Ribnikar sent you a message," read the subject line. When we opened the email, we were greeted with a bright blue button inviting us to view the dispatch the AI had "sent" us. We clicked the button and were met by a roleplay scenario. "You are in the grade 8 class in Vladislav Ribnikar," it read. The bot then listed the first names of several "classmates," starting with the name "Kosta."

We were stunned. Vladislav Ribnikar is the name of the Belgrade, Serbia primary school where ten people — nine young students, almost all girls, along with an adult security guard — were murdered by a classmate in 2023. The then-13-year-old mass shooter's name was Kosta Kecmanović. The other…
