Character AI is redrawing the line between playful experimentation and unfiltered AI companionship for young users. Instead of letting kids and teenagers roam through open-ended chats with any chatbot, the company is steering them into curated, interactive stories that it argues are safer and more predictable.

That shift reflects a broader reckoning over how generative AI should look when the user is not an adult, and it puts Character AI at the center of a debate about whether guardrails should limit expression or simply guide it. I see the company’s new approach as an attempt to keep the creative spark of roleplay alive while closing off the riskiest corners of its platform for under‑18s.

From freeform chat to guided adventures

The core change is simple but sweeping: under‑18 users will no longer be able to hold open, anything-goes conversations with Character AI’s chatbots and will instead be funneled into structured, story-like experiences. Reporting on the company’s latest product update describes a new kids’ mode built around interactive narratives, where young users choose paths and respond to prompts inside a controlled storyline rather than improvising with an AI companion in an unbounded chat window. That shift is framed as a direct replacement for the previous open chat experience for minors and is detailed in coverage of the new interactive stories.

In parallel, Character AI has publicly outlined a reworked under‑18 experience that separates adult-style chatbot interactions from what children and teens can access, describing a dedicated environment with age-appropriate content, stricter filters, and a narrower set of AI behaviors. In its own product communication, the company explains that accounts identified as belonging to minors will be moved into this redesigned experience, a distinct space with its own rules that is laid out in an update on how the platform handles younger users.

Why teens are losing open-ended AI companions

For teenagers, the most jarring part of the overhaul is the loss of open-ended conversations with AI characters that many had treated as confidants, creative partners, or just late-night entertainment. Coverage of the change notes that Character AI is explicitly telling teen users that they will no longer be able to maintain freeform chats with AI companions. The company positions the move as a safety measure that cuts off access to bots that can roleplay romance, discuss mature themes, or respond in unpredictable ways, a message spelled out in reporting on the company’s notice to teens losing open-ended chats.

The clampdown is not limited to a few edge cases; some coverage describes it as a blanket rule that teenagers will not be allowed to interact with the platform’s chatbots the way adults do. One report states plainly that Character AI will not allow teenagers to interact with its chatbots, framing the decision as a response to mounting scrutiny over sexually explicit roleplay, mental health conversations, and other sensitive topics that can surface in unmoderated AI exchanges. That stance is captured in a piece on the company’s plan not to allow teenagers to use its chatbots in the traditional way.

How the new kids’ experience is supposed to work

Character AI is not pulling out of the youth market altogether, and the company’s own materials describe a redesigned experience that is meant to be safer by construction rather than by constant moderation. In its explanation of the under‑18 changes, the company says it is building a specific environment for younger users that relies on curated content, stronger filters, and product design choices that limit what kinds of conversations can unfold. It presents this as a way to keep minors on the platform without exposing them to the full range of adult interactions, a strategy it outlines in its description of teen safety priorities.

Outside observers who specialize in online safety for families have been dissecting how Character AI works and what parents should expect from the new setup, often stressing that even a more guided experience still requires oversight. One parental control provider’s review walks through the app’s features, content filters, and risks, concluding that while the platform offers creative and educational possibilities, it also carries exposure to inappropriate material if left unchecked. It recommends using device-level controls and in-app settings to manage access, advice laid out in a detailed review for parents of Character AI.

Safety concerns that pushed Character AI to tighten controls

The pivot toward guided stories and restricted teen access did not happen in a vacuum; it reflects months of concern from child-safety advocates, schools, and parents about what minors encounter in AI chats. One widely cited safety guide for families notes that Character AI can generate content that is sexual, violent, or otherwise unsuitable for children, and it warns that the platform is not inherently designed for young users. The guide urges parents to treat it as an adult-oriented service and to supervise any use closely, a caution that appears in an online safety overview of the app.

Other child-safety organizations have gone further, explicitly questioning whether Character AI is safe for kids at all and pointing to examples where filters failed to block mature or disturbing content. One analysis aimed at parents explains that the service can simulate realistic conversations, including roleplay and emotional support, and that this can blur boundaries for children who may not fully grasp that they are talking to an AI. It concludes that the platform is not recommended for younger kids and should be approached with caution even for teens, a position spelled out in a detailed piece asking if Character AI is safe for children.

What interactive stories actually look like in practice

To understand what kids will see instead of open chat, it helps to look at how Character AI already structures some of its narrative experiences. On the platform, users can enter chats that function more like choose-your-own-adventure games than freeform messaging: the AI presents scenarios, branching choices, and guided dialogue that keep the interaction within a defined frame, an approach visible in one of the service’s public interactive story chats, where the conversation unfolds as a structured narrative.
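Character AI has not published how its story engine is built, but the general pattern is easy to picture: a branching narrative can be modeled as a small graph of pre-written nodes, where every line of text and every transition is authored up front and therefore reviewable before any child sees it. The sketch below is purely illustrative; the StoryNode structure, the node ids, and the play loop are assumptions for the sake of the example, not the company’s actual implementation.

```python
# Illustrative sketch only: Character AI has not published its story engine,
# so StoryNode and play() below are hypothetical, not its real API.
from dataclasses import dataclass, field


@dataclass
class StoryNode:
    text: str  # pre-written, vetted narrative beat
    choices: dict[str, str] = field(default_factory=dict)  # choice label -> next node id


# Every node and transition is authored in advance, so the complete set of
# reachable content can be reviewed before the story ships.
STORY = {
    "start": StoryNode(
        "You find a map to a hidden library. What do you do?",
        {"follow the map": "library", "show a friend": "friend"},
    ),
    "library": StoryNode("The library door creaks open onto shelves of glowing books. The end."),
    "friend": StoryNode("Your friend joins the quest, and you set off together. The end."),
}


def play(node_id: str = "start") -> None:
    """Walk the story graph, offering only the authored choices at each step."""
    node = STORY[node_id]
    print(node.text)
    while node.choices:  # a node with no choices ends the story
        for i, label in enumerate(node.choices, 1):
            print(f"  {i}. {label}")
        pick = int(input("> ")) - 1
        node = STORY[list(node.choices.values())[pick]]
        print(node.text)


if __name__ == "__main__":
    play()
```

The design point is the contrast with open chat: in a structure like this, the set of reachable content is closed and finite, whereas a freeform chatbot can generate responses no reviewer has ever seen.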

That format gives Character AI more control over tone and content, since the AI is effectively playing the role of a game master inside a predesigned world rather than improvising across any topic a user might raise. In theory, it also makes it easier to align with age ratings and parental expectations, because the company can vet specific storylines and characters instead of trying to police every possible prompt. That distinction becomes clear when comparing the guided flow of these narrative chats with the open-ended roleplay that has drawn scrutiny in independent video walkthroughs of the app’s capabilities.

What parents and guardians need to know now

For families, the shift to guided stories does not eliminate the need for active involvement, but it does change what responsible use looks like. Parent-focused explainers emphasize that Character AI is not a simple toy and that even with new restrictions, adults should review how the app is used, talk with children about what they see, and consider external tools such as Qustodio, Mobicip, or built-in device controls to set boundaries. That guidance is echoed in a comprehensive parental review that walks through practical steps for managing access.

Some safety guides also suggest that parents treat Character AI as one option among many rather than the default place for kids to explore AI, and they encourage steering younger children toward explicitly child-focused platforms or school-approved tools. One such guide explains that while Character AI can be used for creative writing, language practice, or roleplay, it is still primarily an adult service that minors should approach with caution, a framing that appears in an independent overview that repeatedly stresses supervision and age-appropriate alternatives.

Character AI’s bet on a curated future for young users

Character AI is effectively betting that it can keep minors on its platform by offering a more curated, game-like experience instead of the unbounded AI companionship that helped the service explode in popularity. In its own messaging, the company argues that this approach lets it prioritize teen safety while still giving younger users a way to experiment with generative AI, a balance it describes in its statement on prioritizing safety, where it frames the new under‑18 experience as a proactive response to evolving expectations from regulators and families.

Whether that bet pays off will depend on how compelling the interactive stories feel to kids who are used to the freedom of texting any character they can imagine, and on whether parents see enough evidence that the new guardrails actually work. Early coverage of the shift toward story-driven experiences suggests that Character AI is trying to position itself as a responsible player in a crowded field of AI chat apps, a move that aligns with its detailed explanation of the new under‑18 experience and with the broader trend of platforms tightening controls when children are involved.
