British Prime Minister Keir Starmer warned Instagram, TikTok, and other social media platforms in late March 2026 that his government will “have to act” against features he says are engineered to keep children glued to their screens. In a first-person Substack post reported by the Associated Press, Starmer called out infinite scrolling and streaks by name, describing them as “addictive features” that demand regulation rather than voluntary reform.
“No option is off the table,” Starmer wrote, adding that his government is examining how to shield children from compulsive platform design. That language puts the U.K. on a path that could lead to an outright ban on social media accounts for users under 16, similar to legislation Australia passed in late 2024, or to a more targeted approach that strips specific algorithmic features from young people’s feeds while leaving basic access intact.
What Starmer is proposing
The prime minister’s post moved the U.K. debate beyond general hand-wringing about screen time and into specific territory: the mechanics that make platforms sticky. Infinite scrolling, the design choice that eliminates natural stopping points in a feed, and streaks, the reward systems that pressure users to log in daily, were his two headline targets. Both are core engagement tools for platforms like TikTok, Instagram, and Snapchat, and both have drawn scrutiny from child-safety researchers who argue they exploit developing brains’ sensitivity to variable rewards.
Starmer also referenced the need to regulate algorithms themselves, suggesting any new rules could go beyond simple age gates to address how platforms curate and serve content to younger audiences. That distinction matters. Age verification asks “How old are you?” Algorithm regulation asks “What is this platform doing to your attention once you’re inside?” The second question is harder to enforce but potentially more consequential.
His comments arrived alongside two parallel developments. A group of Labour backbenchers sent a letter to Downing Street calling for a full ban on social media access for anyone under 16, adding internal party pressure to the government’s timeline. Separately, U.K. ministers have been dispatched to Australia to study that country’s youth social media restrictions, which prohibit children under 16 from holding accounts on major platforms and are expected to take full effect later this year.
Where the Online Safety Act fits in
The U.K. is not starting from scratch. The Online Safety Act, passed in 2023, already gives the communications regulator Ofcom broad powers to hold platforms accountable for protecting children from harmful content. Ofcom has been rolling out enforceable codes of practice under that law, requiring platforms to conduct risk assessments and implement age-appropriate safety measures.
But the Online Safety Act was built primarily around content, not design. It targets what children see, not how the platform keeps them watching. Starmer’s focus on infinite scrolling and algorithmic feeds signals that his government views the existing framework as necessary but insufficient. Any new regulation would likely sit alongside the Online Safety Act rather than replace it, filling what ministers appear to see as a gap between content moderation and addictive product design.
The government has launched a formal consultation on possible restrictions and is running pilot programs that limit certain features among teenage users. Details on those pilots, including sample sizes, duration, and what exactly is being switched off, have not been made public. Whether they measure mental health outcomes, academic performance, time on platform, or some combination remains unclear.
How platforms and experts have responded
Neither Meta, which owns Instagram, nor ByteDance, TikTok’s parent company, had issued a public response to Starmer’s statements as of early April 2026. Snapchat, whose streaks feature is among the most prominent examples of the mechanic Starmer criticized, has also stayed quiet. The silence is notable but not unusual; platforms facing regulatory pressure in one country often wait for formal proposals before engaging publicly, preferring behind-the-scenes lobbying to headline-grabbing confrontations.
Past industry responses to similar pressure offer some guide to what may come. When Australia moved toward its under-16 ban, Meta and TikTok argued that age verification technology was unreliable and that bans would push young users toward less regulated corners of the internet. In the European Union, where the Digital Services Act imposes transparency requirements on algorithmic recommendation systems, platforms have introduced opt-out tools and adjusted default settings for minors rather than face blanket restrictions.
Child-safety organizations in the U.K., including the NSPCC and 5Rights Foundation, have long called for stronger regulation of addictive design features aimed at children. Baroness Beeban Kidron, who championed the U.K.’s Age Appropriate Design Code, has argued that platforms knowingly deploy psychological techniques to maximize engagement among young users and that voluntary commitments have repeatedly fallen short. That advocacy provides political cover for Starmer’s push, even as the specific evidence base for the U.K. pilots remains unpublished.
The Australian model and its limits
Australia’s Social Media Minimum Age Act, passed in November 2024, set 16 as the minimum age for holding an account on designated platforms. It placed the burden of compliance on the platforms rather than on parents or children, a design choice intended to avoid criminalizing young users. But the law has faced questions about enforcement. Age-verification technology remains imperfect, and critics have pointed out that determined teenagers can circumvent checks using VPNs, borrowed credentials, or false birthdates.
The U.K. delegation studying Australia’s approach will need to grapple with a fundamental difference in strategy. Australia’s law is account-level: you either have access or you don’t. Starmer’s language points toward feature-level regulation: you can have an account, but certain design elements are switched off if you’re under a given age. That is a more granular approach, and potentially more technically demanding. It requires platforms not just to verify age at sign-up but to maintain different product experiences for different user segments on an ongoing basis.
Whether that kind of tiered design is technically feasible at scale, and whether platforms would implement it faithfully rather than in a token fashion, are open questions. No country has yet attempted feature-level social media regulation for minors, which means the U.K. would be charting new territory if it follows through on Starmer’s rhetoric.
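To make the contrast concrete, the tiered design Starmer’s language implies can be sketched as a simple age-based feature gate. This is purely illustrative: the flag names (`infinite_scroll`, `streaks`, `personalized_feed`) and the cutoff logic are assumptions, not drawn from any real platform’s systems or from proposed U.K. rules.

```python
from dataclasses import dataclass

# Hypothetical feature flags a platform might gate by verified age.
# Flag names are illustrative, not taken from any real platform.
@dataclass(frozen=True)
class FeatureSet:
    infinite_scroll: bool
    streaks: bool
    personalized_feed: bool

def features_for_age(age: int, minimum_age: int = 16) -> FeatureSet:
    """Return the feature set served to a user, assuming age is already verified."""
    if age < minimum_age:
        # Under-16s keep the account but lose the engagement-maximizing features.
        return FeatureSet(infinite_scroll=False, streaks=False,
                          personalized_feed=False)
    return FeatureSet(infinite_scroll=True, streaks=True,
                      personalized_feed=True)
```

The hard part, as the paragraph above notes, is not this branch but everything around it: verifying age reliably, keeping the segmentation current as users age, and maintaining two product experiences in parallel.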
What comes next
For families with children on these platforms, nothing changes immediately. No new rules are in effect, and no compliance deadline has been set. The consultation process will include a public comment period before any regulation takes shape, and the government has not announced a timeline for publishing draft legislation or regulatory guidance.
The milestones to watch are the publication of any pilot findings, the government’s formal consultation response, and the first draft of legislation or Ofcom guidance that translates Starmer’s rhetoric into enforceable obligations. Until those arrive, parents and educators are left with existing tools: built-in screen-time limits, parental controls, and direct conversations about online habits.
The broader stakes extend well beyond the U.K. Starmer’s language about algorithms and addictive design echoes regulatory moves in the E.U., Australia, and several U.S. states. If the U.K. follows through with feature-level restrictions, platforms operating globally could face a patchwork of incompatible rules, each demanding different default settings for young users in different jurisdictions. That compliance burden, not the rhetoric of any single leader, is what may ultimately force Instagram and TikTok to redesign how their core products work for minors everywhere.
*This article was researched with the help of AI, with human editors creating the final content.*