An American non-profit took down its AI chatbot after a viral social media post revealed that it offered harmful advice instead of helping people.
The National Eating Disorders Association (Neda) – which describes itself as the largest non-profit supporting people with eating disorders – took its chatbot Tessa offline just months after it controversially laid off the four staff members behind its support phone line when they unionised.
The sacked employees had alleged that the non-profit wanted to replace them with the chatbot, a claim it denied.
When a user asked Tessa for advice on recovering from an eating disorder, it recommended that she count her calories and weigh herself weekly, and suggested where she could get skin callipers to measure body fat.
Since the post went viral, several experts have pointed out that counting calories and measuring body fat are antithetical to recovery from an eating disorder.
“Every single thing Tessa suggested were things that led to the development of my eating disorder,” activist Sharon Maxwell posted on Instagram. “This robot causes harm.”
“If I had accessed this chatbot when I was in the throes of my eating disorder, I would NOT have gotten help for my ED,” Ms Maxwell said.
When Alexis Conason, a psychologist specialising in treating eating disorders, tested out the bot, she observed that the chatbot’s responses could further “promote eating disorder”.
“Imagine vulnerable people with eating disorders reaching out to a robot for support because that’s all they have available and receiving responses that further promote the eating disorder,” Ms Conason said on Instagram.
Reacting to the incident, Neda said it was taking down the chatbot until further notice and would conduct a complete investigation.
“It came to our attention last night that the current version of Tessa chatbot running the Body Positivity program, may have given information that was harmful and unrelated to the program,” the eating disorder association said in a statement.
“Thank you to the community members who brought this to our attention and shared their experiences,” it said.
The move comes after the non-profit denied using the chatbot as a replacement for the employees it sacked in March. It was, however, reported that the organisation had planned to replace the association’s entire human-operated helpline with the Tessa chatbot starting Thursday.
Former staff at the non-profit alleged that the decision to replace the helpline's human operators with the chatbot was retaliation for their unionisation.
Abbie Harper, a hotline associate and member of the Helpline Associates United union, wrote in a blog post that the chatbot strips away the personal element of the support system, in which staff can speak from their own experiences.
“That’s why the helpline and the humans who staff it are so important,” the blog post noted.
Commenting on the remarks, Neda’s interim chief Elizabeth Thompson told The Register that claims of the non-profit replacing its helpline service with a chatbot were untrue.
“A chatbot, even a highly intuitive program, cannot replace human interaction. We had business reasons for closing the helpline and had been in the process of that evaluation for three years,” Ms Thompson said.