The National Eating Disorder Helpline has made the decision to replace its human staff with an AI chatbot.
The helpline’s staff union strongly condemns the decision, expressing disappointment with the organization.
The National Eating Disorder Association has decided to shut down its longstanding telephone helpline. Effective June 1, NEDA will let go of the small team responsible for managing and operating the helpline. In its place, the organization will introduce an AI-driven chatbot called “Tessa” to assist individuals seeking support. The move comes after the employees running the helpline formed a union, leading to their dismissal by NEDA.
NEDA’s staff was notified of their termination and the helpline’s closure only four days after they announced a union earlier this month. The workers’ group, Helpline Associates United, claims that NEDA is punishing them for unionizing.
The association’s spokesperson Chase told Gizmodo that NEDA has moved on from the helpline, which was established in 1999 but which the organization now considers obsolete in the internet era. Instead, NEDA aims to improve its online experience and plans to launch an updated website by the end of 2023.
According to NEDA, approximately 70,000 individuals sought assistance from the helpline last year. During the COVID-19 crisis, the number of people seeking help more than doubled and never returned to pre-pandemic levels.
The workers’ union condemns the decision: “A chatbot is no substitute for human empathy, and we believe this decision will cause irreparable harm to the eating disorders community.”
Helpline workers provide a vital service to people who are struggling and need someone to listen and offer support. Replacing them with an AI chatbot raises significant concerns. A chatbot may lack the ability to empathize with callers and understand their emotions, and it may struggle with complex or sensitive situations that require human judgment and intervention. A chatbot cannot establish trust or make people feel valued and respected; worse, incorrect automated responses could compound the harm.