The European Commission has taken its most direct shot yet at the way TikTok keeps users glued to their screens, issuing preliminary findings that the platform’s core design features violate the Digital Services Act (DSA). The findings target specific mechanics, including infinite scroll, autoplay, and push notifications, framing them not as neutral product choices but as systemic risks that TikTok failed to assess or mitigate. For the hundreds of millions of people who open the app daily across Europe, this is no longer an abstract regulatory debate; it is a concrete enforcement action that could force the platform to redesign how its feed works or face steep financial penalties.
At the heart of the dispute is a shift in how Brussels understands platform responsibility. Rather than focusing only on illegal content or data protection, the DSA pushes very large online platforms to examine how their own algorithms and interface choices shape user behavior and well-being. The European Commission is now arguing that TikTok’s engagement-maximizing design crosses a line: not only did the company allegedly fail to identify and reduce the risks created by these features, but the very architecture of its service may be incompatible with the duties that come with operating at continental scale.
What the Commission Actually Found
The Commission’s preliminary view, announced on February 6, 2026, is that TikTok’s product architecture amounts to “addictive design” that breaches the DSA’s requirement for platforms to identify and reduce systemic risks to users. In its official communication, the Commission explicitly names infinite scroll, autoplay, push notifications, and a highly personalized recommender system as the features under scrutiny. The allegation is not simply that these mechanics exist, but that TikTok failed to adequately assess and mitigate the risks they pose, particularly to younger users. A related notice on the Commission’s press corner emphasizes concerns around safety and security for minors, placing children at the center of the enforcement narrative.
This distinction is crucial. The DSA, formally set out in Regulation (EU) 2022/2065, does not prohibit infinite scroll or personalized feeds as such. Instead, it obliges very large platforms to conduct robust risk assessments of their design choices and then implement effective measures to reduce any harms those designs create. The law also addresses manipulative interface patterns, often referred to as dark patterns, that can steer users in ways they do not fully understand. The Commission’s case against TikTok rests on the argument that the company’s mitigations (such as time limits, prompts, or parental controls) were inadequate relative to the scale and nature of the risks. That framing flips the burden: TikTok must now demonstrate that its safeguards are proportionate and effective, rather than regulators having to prove direct causation between specific design elements and concrete harm.
A Trail of Enforcement Actions
The February 2026 preliminary findings did not emerge from nowhere. The Commission had already opened formal proceedings against TikTok under the DSA, defining an investigative scope that went well beyond addictive design alone. That earlier step set out four main areas of concern: risk management around engagement-driven features, protection of minors, transparency in advertising, and access to platform data for independent researchers. Each strand has fed into the broader picture now taking shape, providing legal and factual groundwork for the new findings. The formal proceedings also triggered procedural rights and obligations on both sides, making it clear that this would be a test case for how the DSA is enforced against a global social media giant.
In parallel, Brussels has targeted specific TikTok experiments as they rolled out. When the company introduced TikTok Lite’s reward program in France and Spain, the Commission swiftly opened proceedings focused on that feature and signaled its readiness to suspend it across the EU. Regulators argued that tying rewards to engagement could exacerbate compulsive use, especially among younger people, and they were willing to intervene before long-term evidence accumulated. This pattern of moving quickly against discrete features while building a more comprehensive case reflects a deliberate strategy. The Commission is effectively testing its legal theories on narrower issues, then scaling them up to challenge the overall architecture of a platform that, in its view, has normalized attention-maximizing design.
The Science Behind the Scroll
The Commission’s depiction of TikTok’s feed as addictive is bolstered, albeit indirectly, by emerging research on short-form video and cognition. A preprint posted on arXiv, titled “Short-Form Videos Degrade Our Capacity to Retain Intentions: Effect of Context Switching On Prospective Memory,” offers experimental evidence that the rapid context switching typical of feeds like TikTok’s can impair prospective memory, the ability to remember to carry out intended actions. In the study, participants exposed to sequences of quickly changing, unrelated clips showed reduced performance on tasks requiring them to remember future intentions, compared with those in more stable viewing conditions. While the research does not single out TikTok by name, it describes a pattern of interaction that closely resembles the platform’s endless scroll of short videos.
There are important caveats. As a preprint, the paper has not yet undergone full peer review, and its findings should be treated as provisional. Laboratory experiments also cannot capture the full complexity of everyday media use, where people switch between apps, devices, and offline activities. Still, the results align with a broader concern surfacing in policy debates: that feeds optimized for engagement may impose cognitive and psychological costs that users neither anticipate nor fully consent to. This is precisely the gap the DSA’s systemic risk provisions are meant to address. If a platform’s internal risk assessments downplay or ignore such emerging evidence, the Commission is more likely to conclude that its mitigations fall short of what the law requires. Even uncertain science can thus play a role, not as definitive proof of harm, but as a signal that regulators should demand more rigorous self-scrutiny from very large platforms.
What TikTok Faces Next
The preliminary findings are not a final verdict, but they mark a decisive escalation. TikTok now has the opportunity to respond, submit evidence, and propose changes before the Commission adopts a final decision. According to reporting from AP, EU officials have made it clear that the company must modify what they describe as addictive design patterns or risk substantial fines under the DSA’s penalty framework. The focus, as described in that coverage, is squarely on protecting children and other vulnerable users from design choices that may encourage excessive or compulsive use. TikTok, for its part, is expected to argue that it already offers tools such as screen time limits, content filters, and parental controls, and that users retain meaningful agency over how they interact with the app.
The practical question is what “adequate mitigation” looks like for a service whose core appeal lies in frictionless consumption. Regulators could push for stronger default limits on usage for minors, more prominent and frequent break reminders, or options to disable autoplay and infinite scroll by default. They might also demand greater transparency around how the recommendation system prioritizes content, and clearer, less manipulative interfaces for adjusting those settings. Any such remedies would have to be calibrated against the DSA’s broader principles, which aim to reduce systemic risks without dictating product design in minute detail. Whatever the outcome, the case is poised to become a reference point for how Europe applies its new digital rulebook to attention-driven platforms, and for how far lawmakers are willing to go in reshaping the incentives that underpin the modern social media economy.
A Test Case for the DSA Era
Beyond TikTok itself, the Commission’s move signals how it intends to wield the DSA as a structural tool rather than a narrow content-policing regime. By centering design choices like infinite scroll and autoplay, Brussels is asserting that the architecture of online services can be regulated when it creates systemic risks, especially for minors. Other very large platforms that rely on similar engagement mechanics will be watching closely. If the Commission succeeds in forcing meaningful redesigns here, it could embolden regulators to scrutinize everything from recommendation algorithms on video platforms to notification strategies in messaging apps. The law’s emphasis on risk assessment and mitigation gives authorities a framework to ask not just “What content do you host?” but “How does your design shape behavior and harm?”
The outcome will also help define the balance of power between global tech firms and European regulators. The European Commission has positioned itself as a central enforcer of digital rules, and high-profile cases like this one test its ability to translate ambitious legislation into concrete changes on users’ screens. If TikTok agrees to or is compelled to alter its core engagement features, it may set a precedent for negotiated design changes under regulatory pressure. If, instead, the company opts for a more confrontational path, the dispute could end up before the EU courts, where judges would have the final say on how far the DSA reaches into product design. Either way, the proceedings underscore that in Europe’s new regulatory landscape, the way a platform keeps people scrolling is no longer just a business decision. It is a matter of law.
*This article was researched with the help of AI, with human editors creating the final content.*