Character.AI Users in Full Meltdown After Minors Banned From Chats

Last week, the embattled chatbot platform Character.AI announced that it would move to ban minors from conversing with its many thousands of AI companion and roleplay bots. Site users, self-avowed minors and adults alike, have a lot of thoughts.

The policy change comes as the controversial AI company continues to battle multiple lawsuits alleging that interactions with its chatbots caused real-world emotional and physical harm to underage users; multiple teen users have died by suicide following extensive conversations with bots hosted on the platform.

Character.AI now says people under 18 will no longer be allowed to engage in what it refers to as “open-ended” chats, which seemingly means the long-form, unstructured conversations on which the service was built, in which users text and voice-call back and forth with the site’s anthropomorphic AI-powered chatbot “characters.”

Minors won’t be kicked off the site entirely. According to Character.AI, it’s working to create a distinct, presumably much more limited “under-18 experience” that will offer teens some access to certain AI-generated content, though specifics remain vague.

To enforce the shift, Character.AI says it’ll use automated in-house age verification tools, as well as third-party tools, to determine whether a user is under 18. By November 25, accounts the site determines belong to minors will no longer be able to engage in unstructured conversations with the platform’s emotive AI chatbots, according to the company.

Given that unstructured chats with platform bots have long been the…