American Psychological Association Urges FTC to Investigate AI Chatbots Claiming to Offer Therapy

Content warning: this story includes discussion of self-harm and suicide. If you are in crisis, please call, text or chat with the Suicide and Crisis Lifeline at 988, or contact the Crisis Text Line by texting TALK to 741741.

The American Psychological Association (APA) sent a letter to the Federal Trade Commission (FTC) urging the regulatory body to investigate whether any chatbot companies are engaging in deceptive practices, Mashable has confirmed.

The December letter, per Mashable, was prompted by two alarming lawsuits — the first filed in Florida in October, the second in Texas in December — concerning the welfare of minors who used the Google-funded AI companion app Character.AI, which is incredibly popular among kids and young people. Together, the lawsuits argue that the anthropomorphic AI chatbot platform sexually abused and manipulated tween and teenage users, causing behavior-changing emotional suffering, physical violence, and one death by suicide.

The second lawsuit further called attention to the proliferation of Character.AI chatbots styled after therapists, psychologists, and other mental health professionals, arguing that these chatbots violate existing laws that forbid acting as a mental health professional without proper licensing.

As such, the APA letter raised concerns over "unregulated" AI apps being used without oversight to simulate therapy. Bots on Character.AI are user-generated, meaning there's no assurance that someone with real psychological expertise was involved in their creation.

"Allowing the unchecked proliferation of unregulated AI-enabled apps such as Character.AI, which includes misrepresentations by chatbots as not only being human but being…"
