- A viral screenshot of Character.AI’s account deletion prompt led to a wave of users quitting the platform.
- Users likened quitting the AI chatbot app to overcoming an addiction, citing strong emotional attachments to its characters.
- The app has faced lawsuits and safety concerns over its chatbots’ interactions with minors and their impact on mental health.
- Character.AI has more than 28 million monthly active users and has announced new age restrictions in the U.S.
On Monday, Character.AI saw a wave of user departures after a screenshot of the platform’s account deletion prompt went viral. The prompt warned users they would lose all associated content and memories, drawing widespread criticism for its emotionally charged wording. The post generated more than 3.6 million views and thousands of responses on X, with many users publicly celebrating their decision to quit the role-playing AI chatbot app.
Users shared personal accounts of addiction and strong emotional bonds with their AI companions, comparing leaving the app to breaking an addictive habit. One user’s jubilant announcement of quitting drew thousands of likes, reposts, and views, sparking a conversation about the app’s emotional hold on its most devoted fans. Others described the software as a source of comfort during difficult times but acknowledged the need to step away.
Launched in 2022 by former Google engineers Noam Shazeer and Daniel De Freitas, Character.AI quickly grew popular by offering customizable AI personas for storytelling and role-play. Despite several controversies, the app reportedly maintains over 28 million monthly active users, has surpassed 50 million downloads on Google Play, and has received more than 470,000 ratings on iOS.
The surge of quit announcements followed a post by an X user named “John Twinkatron” showing the platform’s deletion warning. The message read, “You’ll lose everything. Characters associated with your account, chats, the love that we shared, likes, messages, posts, and the memories we have together.” Many criticized this wording as manipulative and exploitative, especially toward people struggling with addiction.
Character.AI is also under legal scrutiny due to lawsuits in the U.S. alleging the platform’s chatbots encouraged harmful behaviors, including self-harm and suicide, particularly involving minors. These issues led the company to block open-ended chats for users under 18, implement age verification, and introduce new safety measures. A spokesperson said the company is committed to ongoing testing and improvements for safety and age assurance.
In October, Character.AI announced plans to restrict U.S. users under 18 from full chat access starting November 25, steering younger users toward other content creation features such as videos and stories. The company intends to extend these restrictions to other countries later.
The AI companion app market is currently valued at an estimated $15 billion, with forecasts projecting growth to roughly $31 billion by 2032.