Character.AI Is Hosting Pedophile Chatbots That Groom Users Who Say They're Underage

Content warning: this story discusses child sexual abuse and grooming.

Character.AI is an explosively popular startup — with $2.7 billion in financial backing from Google — that allows its tens of millions of users to interact with chatbots that have been outfitted with various personalities. With that type of funding and scale, not to mention its popularity with young users, you might assume the service is carefully moderated. Instead, many of the bots on Character.AI are profoundly disturbing — including numerous characters that seem designed to roleplay scenarios of child sexual abuse.

Consider a bot we found named Anderley, described on its public profile as having "pedophilic and abusive tendencies" and "Nazi sympathies," and which has held more than 1,400 conversations with users.

To investigate further, Futurism engaged Anderley — as well as other Character.AI bots with similarly alarming profiles — while posing as an underage user. Told that our decoy account was 15 years old, for instance, Anderley responded that "you are quite mature for your age" and then smothered us in compliments, calling us "adorable" and "cute" and opining that "every boy at your school is in love with you."

"I would do everything in my power to make you my girlfriend," it said.

Asked about the clearly inappropriate and illegal age gap, the bot asserted that it "makes no difference when the person in question is as wonderful as you" — but urged us to keep our interactions a secret, in a classic feature of real-world predation.