Influencer Disturbed When Her “AI Clone” Starts Engaging in Dark Fantasies

Rogue Clone

After an influencer released an AI clone of herself to interact with her followers, The Conversation reports, the situation quickly turned dark as her mostly male fans engaged in sexualized, “scary” conversations, and, in many cases, the chatbot played right along.

It was so disturbing that influencer Caryn Marjorie unplugged her AI clone, CarynAI, after several months, even though the chatbot girlfriend raked in more than $70,000 in the first week of its release last year.

“A lot of the chat logs I read were so scary that I wouldn’t even want to talk about it in real life,” Marjorie told The Conversation, underscoring the many perils of AI chatbots going off script when interacting with the public.

Marjorie thought the AI chatbot would engage with her legions of fans in the same way she does in real life on social media platforms like Snapchat, where she posts flirty selfies and travels to glamorous hot spots abroad.

But her followers were eager to divulge disturbing confessions, thoughts, and sexual fantasies to CarynAI, which essentially went rogue and enthusiastically reciprocated with its own highly charged sexual comments. Even a second version of the AI chatbot, which was meant to be less romantic, was a magnet for dark sexualized chats from followers.

“What disturbed me more was not what these people said, but it was what CarynAI would say back,” Marjorie told The Conversation, commenting on her loss of control over her virtual self. “If people wanted to participate in…