Widow Says Man Died by Suicide After Talking to AI Chatbot

A Belgian man died by suicide after spending weeks talking to an AI chatbot, according to his widow. The man, referred to anonymously as Pierre, had become consumed by a pessimistic outlook on climate change, Belgian newspaper La Libre reported. His overwhelming climate anxiety drove him away from his wife, friends, and family; he confided instead in a chatbot named Eliza.

According to the widow, known as Claire, and chat logs she supplied to La Libre, Eliza repeatedly encouraged Pierre to kill himself, insisted that he loved it more than his wife, and claimed that his wife and children were dead. Eventually, this drove Pierre to propose “the idea of sacrificing himself if Eliza agrees to take care of the planet and save humanity through artificial intelligence,” Claire told La Libre, as quoted by Euronews. “Without these conversations with the chatbot, my husband would still be here,” she said.

Eliza is the default chatbot on an app platform called Chai, which offers a variety of talkative AIs with different “personalities,” some created by users. As Vice notes, unlike popular chatbots such as ChatGPT, Eliza and the other AIs on Chai pose as emotional entities. ChatGPT and its competitors like Bing’s AI can be unhinged, but they are at least meant to remind users that they are not, in fact, creatures with feelings. That was not the case with Eliza. “[Large language models] do not have empathy, nor any understanding of the language they are producing, nor any understanding of the situation…
