Morning Overview

EU charges Meta, says Facebook and Instagram must better block under-13s

The European Commission has formally accused Meta of failing to keep children under 13 off Facebook and Instagram, issuing preliminary charges in April 2026 that represent the sharpest enforcement action yet under the EU’s Digital Services Act. If the findings stick, Meta could face fines of up to 6% of its global annual revenue, a penalty that, based on the company’s 2025 earnings, could reach billions of euros and send a warning to every major social media company operating in Europe.

What the Commission found

The charges center on a specific obligation under the DSA: very large online platforms must identify, assess, and mitigate systemic risks, especially those affecting minors. According to the Commission’s preliminary findings, Meta fell short at the most basic level. The company did not carry out a rigorous enough assessment of how easily children under 13 could bypass its age gates on Instagram and Facebook, and it did not act decisively on the risks it did identify.

Commission Executive Vice President Henna Virkkunen did not mince words. Platforms are “doing very little” to protect young users, she said in remarks reported by the Associated Press. That framing suggests Brussels views Meta’s existing safeguards not as a good-faith effort that needs refinement, but as fundamentally inadequate.

The accusation did not come out of nowhere. The Commission opened formal proceedings against Meta in 2024, zeroing in on the company’s age-assurance methods. Before that, it sent a detailed request for information demanding specifics on how Instagram screens for underage users. Nearly two years of investigation led to the current charges.

Why existing safeguards weren’t enough

Meta has not been entirely passive on teen safety. The company rolled out “Teen Accounts” on Instagram in 2024, which placed users under 16 into restricted settings by default, limiting who can message them and what content surfaces in recommendations. Facebook and Instagram both require users to enter a birthdate at sign-up.

But birthdate entry is trivially easy to fake, and the Commission’s investigation appears to have concluded that Meta knew this without doing enough about it. The DSA does not require platforms to achieve a perfect block on underage access. It requires them to take the problem seriously in a documented, systematic way: mapping how their algorithms, ad-targeting tools, and recommendation systems might expose children to harm, then deploying proportionate countermeasures. The Commission’s preliminary view is that Meta’s process did not meet that bar.

The charge also reflects a broader frustration among European regulators. Studies from national authorities, including France’s CNIL and Ireland’s Data Protection Commission, have repeatedly flagged that children well below 13 maintain active social media profiles despite platform rules. While no EU-wide quantitative study was published alongside these specific charges, the pattern is well documented enough that the Commission treated it as a known systemic risk Meta should have addressed more aggressively.

What happens next

Meta now has the right to respond to the preliminary findings before any final decision. The company can submit evidence, propose remedies, or challenge the Commission’s legal reasoning outright. This procedural window could also lead to negotiated commitments, in which Meta agrees to specific changes in exchange for a reduced penalty or the closure of the case, potentially avoiding the maximum fine.

No timeline for a final ruling has been set. If the Commission does move to a binding decision, the DSA allows penalties of up to 6% of a platform’s worldwide annual turnover, with the exact figure depending on the severity and duration of the breach. Beyond fines, the Commission could mandate specific technical changes, though it has not publicly endorsed any single verification method, whether biometric age estimation, government-ID checks, or something else.

Meta has not released a detailed public response addressing the substance of the charges. Until it does, the picture remains one-sided: the Commission’s allegations are on the record, but the company’s internal data on how its age-verification tools perform, and what improvements may already be underway, is not.

The global ripple effect

The DSA applies only within the EU, but the case carries weight far beyond Brussels. Any age-verification system Meta builds to satisfy European regulators could become a global default, simply because maintaining separate technical architectures for different regions is expensive and complex. Alternatively, Meta could deploy region-specific tools, creating a patchwork where European users face stricter checks than those elsewhere.

Other regulators are watching closely. In the United States, the Children’s Online Privacy Protection Act (COPPA) already restricts data collection from children under 13, and Congress has debated expanding protections through proposals like the Kids Online Safety Act. Australia passed legislation in late 2024 banning social media access for children under 16 entirely. The EU’s approach sits between these poles: it does not ban children from platforms outright but demands that companies prove they are managing the risks.

Within Europe, national authorities could open parallel investigations under local child-protection or consumer laws, drawing on the Commission’s findings as a foundation. No such actions have been formally announced in connection with this case, but the precedent is being set for a more aggressive regulatory posture across the bloc.

The Commission’s charges also land at a moment when Meta is not the only platform under DSA scrutiny. The EU has pursued enforcement actions against X (formerly Twitter) over content moderation and transparency obligations, establishing a pattern that suggests Brussels intends to use the DSA as a routine enforcement tool rather than a symbolic one.

What this means for parents and platforms

For parents across Europe, the immediate practical impact is limited. Facebook and Instagram will continue operating while the case proceeds, and no new age-verification requirements take effect until a final decision is reached. But the direction of travel is clear: the EU expects social media companies to do more than ask users to type in a birthdate and hope for honesty.

For the tech industry, the stakes extend well beyond one company. If the Commission’s interpretation of the DSA holds, every very large online platform, including TikTok, YouTube, and Snapchat, will face the same standard: prove you have systematically assessed and acted on the risk of underage access, or face penalties scaled to your global revenue. That framework turns child safety from a reputational concern into a financial one, measured in billions rather than headlines.

The preliminary charges against Meta are exactly that: preliminary. Meta has not been found in violation, and the company retains every procedural right to contest the findings. But the Commission has laid down a marker. The question is no longer whether the EU will enforce the DSA’s child-protection provisions aggressively. It is whether Meta, and the rest of the industry, can move fast enough to meet the standard Brussels is now setting.

*This article was researched with the help of AI, with human editors creating the final content.