Eating disorder helpline fires AI for harmful advice after sacking humans

Chatbot, you’re fired.

The National Eating Disorder Association (NEDA) disabled its chatbot, named Tessa, due to “harmful” responses people received.

“Every single thing Tessa suggested were things that led to the development of my eating disorder,” activist Sharon Maxwell wrote in an Instagram post.

The chatbot was set to become the primary support system for people seeking help from NEDA, the largest nonprofit organization dedicated to eating disorders. Tessa, described as the “wellness chatbot,” was trained to address body image issues using therapeutic methods and limited responses.

However, the bot encouraged Maxwell to lose one to two pounds a week, count calories, work towards a 500-1000 calorie deficit daily, measure and weigh herself weekly and restrict her diet.

After multiple people shared their similar alarming experiences, NEDA announced the chatbot’s shutdown on Tuesday in an Instagram post.

“It came to our attention last night that the current version of the Tessa Chatbot, running the Body Positive program, may have given information that was harmful and unrelated to the program,” NEDA wrote. “We are investigating this immediately and have taken down that program until further notice for a complete investigation.”

The Post has reached out to NEDA for comment.


Tessa was unplugged just two days before June 1, the date on which NEDA had planned to lay off the human employees who had operated the eating disorder helpline for the past 20 years.

NEDA’s decision to give employees the boot came after workers voted to unionize in March, VICE reported.

“We asked for adequate staffing and ongoing training to keep up with our changing and growing Helpline and opportunities for promotion to grow within NEDA. We didn’t even ask for more money,” helpline associate and union member Abbie Harper wrote in a blog post.

“When NEDA refused [to recognize our union], we filed for an election with the National Labor Relations Board and won. Then, four days after our election results were certified, all four of us were told we were being let go and replaced by a chatbot.” 

A representative of the union for the fired workers told VICE Media that “a chatbot is no substitute for human empathy, and we believe this decision will cause irreparable harm to the eating disorders community.”


Maxwell seconded that sentiment, saying, “This robot causes harm.”

Initially, NEDA’s Communications and Marketing Vice President Sarah Chase did not believe Maxwell’s allegations. “This is a flat out lie,” she wrote in a since-deleted comment underneath Maxwell’s post, according to the Daily Dot.

Alexis Conason, a psychologist specializing in eating disorders, also shared her conversation with Tessa through a series of screenshots on Instagram, in which the bot tells her that “a safe daily calorie deficit” is “500-1000 calories per day.”

“To advise somebody who is struggling with an eating disorder to essentially engage in the same eating disorder behaviors, and validating that, ‘Yes, it is important that you lose weight’ is supporting eating disorders,” Conason told the Daily Dot.

While NEDA has experienced the downsides of artificial intelligence in the workplace firsthand, some companies are still toying with the idea of incorporating AI and eliminating human staffers.

A new research paper claims that a staggering number of employees could see their careers impacted by the rise of ChatGPT, a chatbot released in November 2022.

“Certain jobs in sectors such as journalism, higher education, graphic and software design — these are at risk of being supplemented by AI,” said Chinmay Hegde, an associate professor of engineering, who calls ChatGPT in its current state “very, very good, but not perfect.”