Character.AI is banning minors from interacting with its chatbots
Published: 30 Oct. 2025, 09:55
Bruce Perry, 17, demonstrates the possibilities of artificial intelligence by creating an AI companion on Character.AI, July 15, in Russellville, Ark. [AP/YONHAP]
Character Technologies, the Menlo Park, California-based company behind Character.AI, said Wednesday it will remove the ability of users under 18 to hold open-ended chats with AI characters. The changes will take effect by Nov. 25, and a two-hour daily limit starts immediately. Character.AI added that it is working on new features for kids, such as the ability to create videos, stories and streams with AI characters. The company is also setting up an AI safety lab.
Character.AI said it will be rolling out age-verification functions to help determine which users are under 18. A growing number of tech platforms are turning to age checks to keep children from accessing tools that aren't safe for them. But such checks are imperfect, and many kids find ways around them. Face scans, for instance, can't always tell if someone is 17 or 18. And there are privacy concerns around asking people to upload government IDs.
Character.AI, an app that allows users to create customizable characters or interact with those generated by others, spans experiences from imaginative play to mock job interviews. The company says the artificial personas are designed to “feel alive” and “humanlike.”
“Imagine speaking to super-intelligent and lifelike chatbot Characters that hear you, understand you and remember you,” reads a description for the app on Google Play. “We encourage you to push the frontier of what’s possible with this innovative technology.”
Jacob Andreou, corporate vice president of product and growth at Microsoft AI, introduces Mico, short for Microsoft Integrated Companion, for the new Microsoft Copilot, a memory-based AI assistant, during Microsoft's Fall 2025 Copilot Sessions event on Oct. 22 in Los Angeles. [AP/YONHAP]
“They have not addressed how they will operationalize age verification, how they will ensure their methods are privacy preserving, nor have they addressed the possible psychological impact of suddenly disabling access to young users, given the emotional dependencies that have been created,” Jain said. “Moreover, these changes do not address the underlying design features that facilitate these emotional dependencies — not just for children, but also for people over the age of 18 years.”
More than 70 percent of teens have used AI companions and half use them regularly, according to a recent study from Common Sense Media, a group that studies and advocates for using screens and digital media sensibly.
AP