Tessa - HARMFUL ADVICE
2023
The National Eating Disorders Association (NEDA) launched an AI chatbot named Tessa as a replacement for its human-staffed helpline. This decision came shortly after the helpline employees announced their intention to unionize.
Almost immediately after its launch, the chatbot was found to be giving harmful advice. That advice included recommending a daily calorie deficit of 500-1,000 calories, suggesting regular weekly weigh-ins, encouraging users to count calories, and recommending skin calipers to measure body fat. While such guidance may be reasonable for the general population, it is dangerous for people with eating disorders: it directly contradicts best practices for eating disorder recovery and can reinforce the very disordered behaviors the helpline existed to address.
After a public outcry, NEDA issued a statement acknowledging that the chatbot "may have given information that was harmful" and took it offline pending an investigation. As of this writing, the chatbot remains disabled, and the page for Tessa has been removed from NEDA's website. The incident sparked a broader conversation about the readiness of AI for sensitive mental health applications and the ethical implications of using technology to replace human empathy and support.
It is a shame that organizations are so bad at learning from failure: only five years earlier, the mighty IBM's Watson for Oncology had failed for similar reasons, reportedly recommending unsafe treatments.
Additional info:
Psychiatrist.com - NEDA Suspends AI Chatbot for Giving Harmful Eating Disorder Advice
The Guardian - US eating disorder helpline takes down AI chatbot over harmful advice
CNN Business - AI chatbot offline after complaints of ‘harmful’ advice
BBC - Eating disorder group pulls chatbot sharing diet advice