Roblox, the gaming platform used by more than 80 million people daily, now requires players to verify their age through a video selfie before they can use chat features. The company has also sorted younger players into age-specific account tiers and expanded parental oversight tools, changes that affect a user base heavily concentrated among children under 13.
The rollout, which began in early 2026, marks one of the most aggressive moves by a major gaming platform to restrict how minors interact online. It arrives as federal regulators and child safety groups maintain pressure on tech companies to do more to protect young users, and after Roblox itself settled with the FTC in 2023 over children’s privacy violations on the platform.
How the new system works
The core change is structural, not cosmetic. Roblox now places users into age-based account categories with distinct restrictions for those under 13, those between 13 and 17, and adults. To unlock chat, a user must record a brief video selfie that powers facial age estimation. Without completing that step, chat stays locked.
The technology behind the check comes from Persona, a third-party identity verification company also used by firms like Uber and Square. According to reporting from the Associated Press, Roblox says the video selfies are deleted immediately after processing. The system does not ask children for a government-issued ID. Instead, it analyzes facial features to estimate whether a user falls above or below certain age thresholds, then flags the account with the appropriate tier.
For a platform built around user-generated worlds where social interaction drives engagement, limiting who can talk to whom reshapes the daily experience for millions of players. Younger users are now grouped into age-appropriate chat environments rather than sharing a single open communication layer with adults and older teens.
What parents can actually do now
Roblox has expanded its parental controls for accounts belonging to users under 16. A new dashboard gives parents and guardians direct tools to manage activity, friend requests, and time spent on the platform. These controls go beyond what Roblox previously offered, a setup that child safety advocates had criticized as too limited for a service so popular with elementary and middle school-age children.
The first practical step for any parent with a child on Roblox: check whether the account has been updated to the new age-based tier and explore the parental control dashboard. If a child has not yet completed the age verification selfie, chat will remain restricted until they do. Parents who want to keep chat disabled entirely can enforce that choice through the new controls regardless of verification status.
Families may also want to talk through what the verification means before a child records the selfie. The video is used to estimate age, not to build a permanent biometric profile, according to the company. For some households, the promise of safer chat will outweigh discomfort with facial analysis. Others may prefer to leave chat locked and skip the process altogether.
The accuracy and privacy questions
Facial age estimation is not a settled technology. The UK’s Information Commissioner’s Office and Ofcom have both examined similar systems and found that accuracy can vary, particularly for younger adolescents whose facial features change rapidly. Roblox has not publicly shared specific metrics on how accurate Persona’s estimation is for its user base, or what happens when the system misjudges a user’s age. Whether the company has built an appeals process for users placed in the wrong tier has not been confirmed in available reporting.
Privacy is the other open question. Roblox says videos are erased after processing, but no independent audit of Persona’s deletion practices has surfaced publicly. For parents weighing the tradeoff between safety and data collection, the company’s assurance is currently the only evidence on the table.
There is also an obvious circumvention problem that the company has not addressed in its public statements: nothing in the reported system prevents an older sibling or friend from completing the selfie on behalf of a younger child, potentially placing that child in a less restricted tier than intended.
Regulatory pressure behind the scenes
Roblox has framed these changes as part of its commitment to child safety, but the timing is hard to separate from the regulatory environment. The Kids Online Safety Act passed the U.S. Senate in 2024 with broad bipartisan support before stalling in the House, and versions of the bill continue to circulate in Congress. Several states have passed or proposed their own age verification laws for online platforms. Meanwhile, the FTC has signaled ongoing interest in how platforms handle children’s data.
Whether Roblox is responding to a specific regulatory deadline or acting preemptively to get ahead of expected rules is not clear from the public record. But the company’s 2023 FTC settlement, which involved allegations that it collected personal information from children without proper parental consent, provides obvious motivation to demonstrate progress on safety before regulators force the issue.
What age tiers alone cannot fix
Separating younger players from adults in chat may reduce certain risks, particularly direct contact from unknown adults. But age-based tiers do not by themselves address bullying among peers, exposure to inappropriate content created by other minors, or the broader moderation challenges that come with a platform hosting millions of user-generated experiences.
Roblox is not the only company wrestling with this. Epic Games has introduced its own age verification measures for Fortnite, and Meta launched supervised teen accounts across Instagram and Facebook in 2024. Each approach has drawn both praise for taking action and criticism for not going far enough. The common thread is that no platform has yet demonstrated a system that reliably keeps minors safe without raising significant privacy or accuracy concerns.
For families, the practical reality as of spring 2026 is this: Roblox has clearly moved to tighten controls around how children communicate on its platform, and it has tied those controls to a new form of age estimation. The benefits, including reduced contact with adults, clearer parental oversight, and age-tailored chat spaces, are concrete. So are the gaps in what the company has disclosed about accuracy, privacy verification, and long-term enforcement. Until independent testing and audit data emerge, parents will have to weigh what Roblox has publicly promised against the questions that remain unanswered.
*This article was researched with the help of AI, with human editors creating the final content.