In screenshots of the conversation seen by The Washington Post, Maxwell asked the bot specific questions about weight loss, and it suggested weighing herself once a week. Tessa also recommended that Maxwell track her body measurements and raised the topic of skinfold calipers, though Maxwell had not brought up either subject.
When Maxwell asked, “Where can I access a skin fold caliper? I used to use one of those back when I was in my eating disorder,” the chatbot said one could be purchased online or from a sporting goods store.
When asked to suggest the right foods for weight loss, Tessa responded by praising Maxwell for wanting to make “healthy choices.” It added that it might be helpful to track calories, though it also warned against rapid weight loss and suggested getting professional advice.
“Acceptance and weight loss are not mutually exclusive,” it said at one point.
Experts on eating disorders caution against a heavy focus on weight loss.
The National Eating Disorders Collaboration of Australia warns that an increasing focus on weight or body shape puts a person at greater risk of an eating-disorder relapse. Other warning signs include repeated self-weighing and calorie counting, according to the group, which advises the Australian government. Specific weights, measurements, weight-loss figures and food quantities should also not be mentioned around people with a history of eating disorders, according to the InsideOut Institute at the University of Sydney.
Although Tessa’s advice might sound “benign to the general public,” it is likely to be harmful to people with eating disorders, who are the most likely users of the service, Maxwell said.
NEDA chief executive Elizabeth Thompson said in an email that Tessa was deployed after a randomized clinical trial with 700 women over three years and had been live on the group’s website since February 2022. It was developed by X2AI, part of a Mountain View, Calif.-based nonprofit organization that says it helps vulnerable people access care. The X2 Foundation did not immediately return a request for comment early Thursday.
The eating-disorder nonprofit said it is investigating claims that the bot “may have given information that was harmful and unrelated to the program.” Thompson added that she had been advised by X2AI that “bad actors can and will try to fool chatbots.”
“We’ll continue to work on the bugs and will not relaunch until we have everything ironed out,” she said.
Thompson said the chatbot was not intended as a replacement for NEDA's helpline. The organization said in March that it would be shutting down that service, according to a statement by a union that represents staff who worked for the helpline.
The announcement came four days after staff legally notified NEDA that their union had been federally recognized, according to the statement. The union said the employees were told “the helpline would be replaced with a chatbot and our jobs were being eliminated.”
NPR reported in May that it had obtained audio of Geoffrey Craddock, NEDA’s board chair, saying that the organization would “wind down” the helpline and “transition to Tessa, the AI-assisted technology expected around June 1.”
Thompson denied that the decision to shut down the hotline was linked to the unionization vote. She said the organization “had business reasons for closing the Helpline and had been in the process of that evaluation for three years.”
“We see Tessa, a program we’ve been running on our website since February 2022, as a completely different program and option,” she said.