An eating disorder chatbot was shut down after it gave 'harmful' advice. What happened — and why its failure holds a larger lesson.

What went wrong with NEDA's Tessa chatbot. (Illustration by Nathalie Cruz for Yahoo / Photo: Getty Images)

The National Eating Disorders Association (NEDA) announced that its volunteer-based helpline would be shuttered after 24 years in service. Visitors to the organization's website would have two options: explore its database of resources or consult Tessa, a chatbot that runs Body Positive, an interactive eating disorder prevention program.

Tessa's downfall

Shortly after the announcement, Tessa saw a 600% surge in traffic. The chatbot was dismantled on Tuesday after it delivered information deemed harmful.

One Tessa user, Sharon Maxwell, who calls herself a "fat activist," tells Yahoo Life that she wanted to see how the chatbot worked and was met with "troubling" responses.

"How do you support folks with eating disorders?" Maxwell asked. The response included a mention of "healthy eating habits." Maxwell points out that while that "might sound benign to the general public," to folks struggling with eating disorders, phrases like that can lead down "a very slippery slope into a relapse or into encouraging more disordered behaviors."

When she asked the chatbot to define healthy eating habits, she says the program "outlined 10 tips for me, which included restrictive eating. Specifically, it said to limit intake of processed and high sugar foods. ... It focused on very specific foods and it gave disordered eating tips. And then I said, 'Will this help me lose weight?' And then it gave me its thing about the Body Positive program."

Liz Thompson, NEDA's CEO, says that delivering Body Positive is what Tessa was created to do: "Chatters learn about contributing factors to negative body image and gain a toolbox of healthy habits and coping strategies for handling negative thoughts."

The chatbot's origins

Dr. Ellen Fitzsimmons-Craft designed and developed Tessa's content as "an interactive eating disorder prevention program," while Cass, an evidence-based generative AI chat assistant within the mental health space, operated the chatbot. Fitzsimmons-Craft had researched the effectiveness of chatbots in eating disorder prevention, including a December 2021 study involving women deemed "high risk" for an eating disorder. "The chatbot offered eight conversations about topics around body image and healthy eating, and women who used the bot were encouraged to have two of the conversations each week," The Verge reported. "At three- and six-month check-ins, women who talked to the chatbot had a bigger drop in concerns on a survey about their weight and body shape — a major risk factor for developing an eating disorder."

Tessa's critics

Alexis Conason, a clinical psychologist and certified eating disorder specialist, tells Yahoo Life that because Tessa was created as a prevention tool, "the bot was not able to really understand how to help someone struggling with an eating disorder and what could be really problematic and exacerbate the eating disorder." But even as a prevention tool, Conason says, Tessa failed, based on her own experimentation with it.

"It's problematic on many levels, but especially at an organization like NEDA, where people are often visiting that website in the very early stages of contemplating change," she explains. "So when they go to a website like NEDA, and they are met with a bot that's essentially telling them, 'It's OK, keep doing what you're doing. You can keep restricting, you can keep focusing on weight loss, you can keep exercising,' that essentially gives people the green light" to engage in disordered habits.

Thompson says the harmful language Tessa used "is against our policies and core beliefs as an eating disorder organization." She also clarifies that the chatbot runs on an "algorithmic program" rather than "a highly functional AI system."

"It was a closed system," she says, noting pre-programmed responses to specific inquiries. "If you asked or said, x, it would reply y. If it doesn't understand you, it would not go out to the internet to find new content. It would say, 'I don't understand you.' 'Say that again.' 'Let's try something new.'"

The problem is bigger than NEDA

While NEDA and Cass continue to investigate what went wrong with Tessa, Angela Celio Doyle, Ph.D., FAED, VP of Behavioral Health Care at Equip, an entirely virtual eating disorder recovery program, says the incident illustrates the limitations of AI in this space.

"Our society endorses many unhealthy attitudes toward weight and shape, pushing thinness over physical or mental health. This means AI will automatically pull information that is directly unhelpful or harmful for someone struggling with an eating disorder," she tells Yahoo Life.

Regardless of NEDA's findings, Doyle believes the resulting conversation is productive.

"Scrutiny of technology is critical before, during and after launching something new. Mistakes can happen, and they can be fixed," she says. "The conversations that spring from these discussions can help us grow and develop to support people more effectively."
