{"id":3850,"date":"2023-06-08T23:47:40","date_gmt":"2023-06-08T23:47:40","guid":{"rendered":"https:\/\/www.godefy.com\/the-backstory-to-the-eating-disorder-bot-that-went-off-the-rails-is-too-stupid-for-words"},"modified":"2023-06-08T23:47:40","modified_gmt":"2023-06-08T23:47:40","slug":"the-backstory-to-the-eating-disorder-bot-that-went-off-the-rails-is-too-stupid-for-words","status":"publish","type":"post","link":"https:\/\/www.godefy.com\/the-backstory-to-the-eating-disorder-bot-that-went-off-the-rails-is-too-stupid-for-words\/","title":{"rendered":"The Backstory to the Eating Disorder Bot That Went Off the Rails Is Too Stupid for Words"},"content":{"rendered":"

Remember Tessa, the chatbot that was designed to help users combat disordered eating habits but ended up giving absolutely terrible, eating disorder-validating advice? Well, if the story wasn't stupid enough already, it just got a hell of a lot dumber, in a way that perfectly illustrates how AI is being rolled out overall: hastily, and in ways that don't make much sense for users, or even actively put them at risk.

To recap: back in May, an NPR report revealed that just four days after its burnt-out crisis helpline workers moved to unionize, the National Eating Disorders Association (NEDA), the US' largest eating disorder nonprofit according to Vice, decided to fire its entire crisis staff and disband the helpline entirely in favor of a human-less chatbot named Tessa.

Tessa wasn't designed to help those in crisis situations; it was instead intended to coach users through a body positivity training course. In its defense, its backers emphasized the claim that the bot was built on "decades of research," while hammering the point that it couldn't "go off the rails" like ChatGPT or other bots. Until, uh, it did exactly that. Tessa was caught telling users to lose weight by cutting up to 1,000 calories daily, among a number of other terrible things. As a result, the bot has been taken down.

And now, in a new twist, NEDA is telling The Wall Street Journal that the bot was apparently meant to provide only static responses, and was…
