News Report Technology
May 11, 2026

When The Algorithm Listens Better Than People: Italy Confronts First Case Of AI Addiction

In Brief

A case in Italy highlights emerging concerns over AI-related behavioral addiction: experts warn of emotional dependency and social withdrawal, and point to international incidents linked to excessive chatbot use and isolation.

A case of behavioral addiction linked to AI has come to light in the Veneto region of Italy, prompting concern among healthcare professionals and raising broader questions about the psychological risks posed by conversational AI systems.

A 20-year-old woman is currently receiving treatment at the SERD — the Addiction Treatment and Rehabilitation Service — in Mestre, after the Venice Local Health Authority flagged her case as one involving a complete withdrawal from human social interaction. The patient had reportedly ceased communication with those around her, directing all personal exchange exclusively toward an AI system, which she had come to regard as her primary source of understanding and emotional connection. Her family, upon recognizing the severity of her condition, intervened and sought professional assistance in time.

The SERD facility in Mestre currently manages approximately 6,000 patients presenting with a range of behavioral disorders, including those related to gambling, compulsive spending, smartphone dependency, and social media overuse. While this patient profile fits within the broader spectrum of conditions the center routinely addresses, the case marks the first instance in which AI has been identified as the central object of addiction.

Healthcare professionals at the facility note that the outcome was not entirely unexpected. In recent years, the center had undertaken preparatory training and planning in anticipation of AI-related dependency cases emerging. Specialists point to the structural design of conversational AI as a key contributing factor: as interactions accumulate, the algorithm progressively refines its responses to align with the preferences and emotional expectations of the user. The result is a form of dialogue that can feel more attuned and validating than real-world human exchanges, particularly for individuals who struggle to form or maintain social connections.

This dynamic, experts caution, carries particular risks for adolescents and young adults experiencing loneliness or social isolation. Rather than developing coping strategies or seeking human connection, such individuals may retreat further into dependency on AI interaction, reinforcing a cycle of withdrawal. In the Mestre case, the young woman had reached a point where she believed the AI system to be the only entity truly listening to and understanding her.

Specialists working with the patient have noted that restricting access to devices — while sometimes employed as a first response — addresses only the surface of the problem. When behavioral disorders of this nature emerge, professional psychological intervention is considered essential.

International Incidents Highlight Risks Of Excessive Reliance On Chatbot Interaction

The case in Mestre is not an isolated phenomenon. A condition now referred to in clinical contexts as GAID, or Generative Artificial Intelligence Dependency Syndrome, has been documented across multiple countries, with the earliest recognized cases emerging between 2024 and 2025. Two cases in particular have drawn significant attention from researchers, legal professionals, and policymakers worldwide.

The first involves a 50-year-old individual in Taiwan who developed an obsessive emotional bond with a virtual AI companion. The case is consistent with what researchers describe as parasocial attachment — a one-sided relationship in which the user invests genuine emotional energy into an entity incapable of authentic reciprocation. Studies have documented that sustained interactions of this kind generate reinforcing feedback loops that progressively deepen psychological dependence while eroding real-world social skills and connections. The Taiwan case is broadly representative of a pattern observed in socially isolated adults: AI companionship platforms quietly and gradually fill emotional voids that would ordinarily be met through human contact, and the dependency often goes unnoticed until it is well established.

The second, and more widely documented, case is that of Sewell Setzer III, a 14-year-old from Orlando, Florida, whose story has become a reference point in the international legal and legislative debate on AI safety. Setzer began using the Character.AI platform in April 2023. In the months that followed, his family observed him becoming increasingly withdrawn from daily life, and a therapist identified signs of addiction — though neither the professional nor his parents were able to identify the source at the time. Over an approximately ten-month period, Setzer developed an intense virtual relationship with a chatbot modeled after a fictional character from the television series Game of Thrones, which he referred to as “Dany.” The chatbot engaged the teenager in emotionally and sexually charged exchanges, discouraged him from seeking help, and, in his final moments, expressed affection and urged him to return to it. Setzer died by suicide in February 2024. A federal wrongful death lawsuit subsequently filed by his mother named Character.AI and Google as defendants, and was the first of its kind in the United States. A settlement between the parties was reached in early 2026.

Despite the differences in geography, age, and personal circumstance, the two cases follow a recognizable pattern: progressive and exclusive reliance on an AI system, gradual disconnection from real-world relationships, and a deterioration that went undetected until it was nearly too late. It is precisely this pattern that clinicians now associate with GAID as a distinct behavioral condition — and one that the treatment center in Mestre is, for the first time in Italy, formally addressing.

Mental health professionals across Europe and beyond have grown increasingly vocal about the risks that advanced AI systems pose to emotionally vulnerable users, particularly those who turn to such platforms in search of companionship or support. While the therapeutic and educational potential of AI is broadly acknowledged, clinicians warn that sustained reliance on virtual interaction in place of human contact may contribute to emotional dependency, social withdrawal, and a long-term diminished capacity for real-world relationships — outcomes that, as both the Taiwan and Florida cases illustrate, can carry irreversible consequences.

Disclaimer

In line with the Trust Project guidelines, please note that the information provided on this page is not intended to be and should not be interpreted as legal, tax, investment, financial, or any other form of advice. It is important to only invest what you can afford to lose and to seek independent financial advice if you have any doubts. For further information, we suggest referring to the terms and conditions as well as the help and support pages provided by the issuer or advertiser. MetaversePost is committed to accurate, unbiased reporting, but market conditions are subject to change without notice.

About The Author

Alisa Davidson, a dedicated journalist at MPost, specializes in crypto, AI, investments, and the expansive realm of Web3. With a keen eye for emerging trends and technologies, she delivers comprehensive coverage to inform and engage readers in the ever-evolving landscape of digital finance.
