Australia orders Roblox, Minecraft, Fortnite and Steam to detail child safety

Australia’s online safety regulator has put four of the world’s biggest gaming platforms on notice, demanding they explain exactly what they do to protect children from grooming, sexual extortion, radicalisation, cyberbullying, and online hate. The eSafety Commissioner on April 22 issued legally enforceable transparency notices to Roblox, Minecraft, Fortnite, and Steam, marking the first time the regulator has used its formal powers to target major gaming services. For millions of young Australians who spend hours each week inside these virtual worlds, the move could expose safety gaps that have stayed largely invisible to parents and lawmakers.

What the regulator is demanding

The notices were issued under the Basic Online Safety Expectations (BOSE) framework, established under the Online Safety Act 2021. That framework requires online services to take “reasonable steps” to keep users safe, with heightened obligations when children are involved. Each of the four platforms must now detail how its systems, staffing levels, and safety-by-design features measure up against those expectations.

The five categories of harm are specific and serious: grooming, sexual extortion, youth radicalisation, cyberbullying, and online hate. These reflect dangers that gaming environments, with their real-time text chat, voice communication, and oceans of user-generated content, can amplify for younger players. The notices are not requests. Under Section 56 of the Online Safety Act, the Commissioner can compel companies to hand over operational data outside any regular reporting schedule, covering complaints handling, response times, moderation performance metrics, and the number of active Australian users.

In its media statement, the eSafety Commissioner said the four services were chosen because of their scale and the central role they play in the online lives of Australian children and teenagers. The regulator stressed that multiplayer games have become social networks in their own right, blending chat, voice, and in-game economies in ways that can expose young players to predatory behaviour and organised abuse.

Why gaming platforms, and why now

The notices arrive at a moment when Australian regulators are tightening the screws on the entire tech sector over child safety. In late 2024, Parliament passed the Online Safety Amendment (Social Media Minimum Age) Act 2024, which will bar children under 16 from holding social media accounts once enforcement begins. The gaming transparency notices extend that protective instinct into interactive entertainment, a category that has historically faced less scrutiny than platforms like Instagram or TikTok despite hosting comparable social features.

There is also an international backdrop. Microsoft agreed in 2023 to pay US$20 million to the U.S. Federal Trade Commission to settle allegations that its Xbox service collected children’s personal information in violation of the Children’s Online Privacy Protection Act (COPPA). Epic Games, maker of Fortnite, paid US$520 million in 2022 to resolve FTC complaints about children’s privacy and deceptive purchase practices. Those cases demonstrated that gaming companies can fall short on child protection even when they publicly champion safety. Australia’s regulator appears to be signalling that it will not wait for harm to surface in court before asking hard questions.

The action also sits within a broader regulatory architecture. Australia maintains a register of Online Safety Codes and Standards covering unlawful material across different service types. The transparency notices to gaming platforms draw on the same statutory foundation, pushing the regulator’s reach deeper into interactive entertainment.

What the platforms have not said

As of April 23, none of the four companies has publicly responded. Without statements from Roblox Corporation, Microsoft (which owns Minecraft), Epic Games, or Valve Corporation (which operates Steam), it is unclear how the platforms plan to comply, what timelines they will follow, or whether any will push back on the scope of the demands.

The eSafety Commissioner has not published specific data on the scale of grooming or radicalisation incidents within these four games in Australia. The categories of harm are clearly defined, but the regulator has not disclosed case counts or complaint volumes tied to these platforms. That makes it difficult to judge whether the action responds to a measurable spike in harm or reflects a broader preventive strategy aimed at high-risk environments.

Enforcement consequences also remain vague. The Online Safety Act does give the notices teeth: failure to comply with a reporting notice attracts a civil penalty of up to 500 penalty units, which under current penalty unit values and the fivefold multiplier for bodies corporate can approach $825,000 for each day of continuing non-compliance. But the regulator has not detailed what escalation steps would follow if a platform’s response is judged inadequate. Until eSafety outlines potential sanctions, the deterrent effect of the notices is hard to gauge.

Another open question is how granular the companies’ disclosures will be. The law allows the Commissioner to seek detailed metrics, but companies often argue that revealing specific moderation thresholds or detection tools could expose trade secrets or help bad actors evade enforcement. Whether Roblox, Microsoft, Epic, or Valve will provide high-level summaries, deeply technical data, or something in between remains to be seen, as does how much material the regulator will eventually make public.

What this means for families and policymakers

For Australian parents and caregivers, the notices represent a concrete step: regulators recognise the risks children face in online games and are willing to compel answers from powerful companies. But recognition is not the same as resolution. There is no public evidence yet that previous BOSE reporting exercises directed at social media and messaging services have led to measurable reductions in abuse or faster response times. The track record of transparency as a tool for driving real change in platform behaviour is still being written.

Until more data emerges, families are likely to remain reliant on their own safeguards: supervising younger children’s play, using in-game privacy and blocking tools, and talking regularly with kids about who they interact with online.

Policymakers and child safety advocates will be watching for three things once the companies respond. First, whether the disclosures reveal previously hidden safety weaknesses. Second, whether eSafety is prepared to challenge vague or incomplete answers. And third, whether the regulator publishes enough information for independent experts to scrutinise both the platforms and the regulatory process itself. Only when those pieces are visible will it become clear whether Australia’s latest assertion of online safety powers marks a genuine turning point for child protection in gaming, or another layer of process on an already complex digital landscape.

*This article was researched with the help of AI, with human editors creating the final content.