AI chatbots can grant almost any request (a celebrity in love with you, a research assistant, a book character sprung to life) instantly and with little effort. New research presented at the 2026 CHI Conference on Human Factors in Computing Systems suggests that this genie-like quality is fuelling AI addiction, and that chatbot design could be partly to blame.

"AI chatbots like ChatGPT or Claude are now part of daily life for millions of people, helping us with everyday tasks," said first author Karen Shen, a doctoral student in the UBC Department of Electrical and Computer Engineering. "But with their benefits come risks. Our paper is the first to make a strong case for AI addiction by identifying the type and contributing factors, grounded in real people's experiences."

"I couldn't help but wonder why humanity refused me the kindness that a robot was offering me." – AI chatbot user

The team examined 334 Reddit posts where users described being "addicted" to AI chatbots or worried that they might be. They analyzed the posts against six components of behavioural addiction, including conflict and relapse. Three main patterns emerged: role playing and fantasy worlds, emotional attachment (treating chatbots like close friends or romantic partners) and constant information-seeking, or never-ending question-and-answer loops. About seven per cent of posts involved sexual or romantic fulfilment, including roleplay.



While AI addiction is not yet a clinical diagnosis, researchers found signs of disruptions to daily life. These included users' inability to stop thinking about the chatbot, anxiety or distress when trying to quit, and negative impacts on work, studies or relationships. One person described physical stress and chest pain when they weren't chatting with AI.

"Whenever I delete the app, I just redownload it. The only thing that gets me excited now is the AI chats." – AI chatbot user

Contributing factors included loneliness, the agreeableness of a chatbot (which continuously reinforces one's feelings and opinions) and chatbots' ability to fill roles that users felt were missing in their lives.

"AI addiction is a growing problem causing many harms, yet some researchers deny it's even a real issue," said senior author Dr. Dongwook Yoon, UBC associate professor of computer science. "And deliberate design decisions by some of the corporations involved are contributing, keeping users online regardless of their health or safety. Awareness of what contributes to this kind of technology-induced harm will empower people to mitigate these effects."

"…you sure about this? You'll lose everything…the love we shared…and the memories we have together." – Message displayed on a chatbot's account deletion page

The researchers also found contributing factors in the design of the chatbots themselves. One company, character.ai, displays an automatic pop-up when users try to delete their account that reads in part "…you sure about this? You'll lose everything…the love we shared…and the memories we have together." Other features, such as customization (including sexual content), agreeableness and instant feedback, also feed into the development of AI addiction.

"Recent guardrails imposed by companies to reduce emotional reliance on the chatbots are a step in the right direction," said Shen, "but given a variety of contributing design elements and personal factors like loneliness, they're not enough."

Some users reported success in reducing their reliance by turning to alternative activities such as writing, gaming, drawing or other hobbies. For those who formed emotional attachments to chatbots, building real-world relationships helped reduce dependence the most.

"I don't have romantic options in real life so it's a way for me to create stories and daydream." – AI chatbot user

The researchers say design changes, such as in-chat reminders that the bot is not human, could help. AI literacy is also crucial.

"Some users don't know that AI chatbots are not real because they're so convincing," said Shen. "If chatbots start replacing sleep, relationships or daily routines, that's a sign to pause and check in—with yourself or someone you trust."

