Morning Overview

Meta & YouTube accused of hooking kids on endless, addictive screen time

In a Los Angeles courtroom, the mechanics of modern childhood are on trial. Lawyers for a California teenager argue that Instagram and YouTube were built like behavioral slot machines, tuned to keep kids swiping and watching long past the point of choice, and that this design helped drive her into depression and suicidal thoughts. The companies insist they never set out to harm anyone, but the case is already reframing social media not as a neutral tool but as a product whose core features may function like an addictive drug for young users.

At stake is far more than one teenager’s story. The lawsuit is a bellwether for thousands of similar claims across the United States that accuse Meta and YouTube of knowingly exploiting children’s vulnerabilities to maximize engagement and profit. If jurors accept that premise, the verdict could force a redesign of the attention economy itself, from autoplay and infinite scroll to the algorithms that decide what kids see first when they wake up and last before they fall asleep.

The bellwether case putting design on trial

The California case centers on a young woman who says she began using Instagram and YouTube in elementary school and eventually spent up to seven hours a day on the platforms. Her lawyers argue that the apps were engineered to capture “very young children,” with recommendation systems that learned her fears and insecurities and then fed them back in a constant loop. According to the complaint, that loop coincided with escalating depression, self-harm and suicidal ideation, turning what looked like ordinary screen time into something closer to compulsive use.

Opening arguments in Los Angeles describe this lawsuit as a test of whether social media companies can be held liable for the way their products are designed, not just for individual posts that appear on them. The case is one of several landmark trials beginning this year that seek to hold the world’s biggest platforms responsible for alleged youth addiction and mental health harms, and it follows earlier litigation against TikTok that was reportedly settled for undisclosed sums without reaching a jury, a sequence that raises the stakes for how courts will now treat Meta and YouTube in particular.

How plaintiffs say the “slot machine” works

To understand the accusation that these apps “hook” kids, it helps to look at the mechanics that lawyers are putting under the microscope. The complaint describes a system of infinite scroll, autoplay and algorithmic recommendations that together create a frictionless path from one piece of content to the next, with no natural stopping point. For a child still learning self-regulation, that design can feel less like browsing and more like being carried along by a current that is very hard to swim against.
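
To make that dynamic concrete, here is a minimal, purely illustrative sketch of an engagement loop of the kind the complaint describes. Everything in it is hypothetical: the nextItem recommender, the emotionalCharge score and the amplification factor are invented for illustration and are not drawn from any platform’s actual code.

```typescript
// Illustrative only: a generic autoplay/infinite-scroll loop of the kind
// the complaint describes. All names and numbers here are hypothetical.

type Item = { id: number; emotionalCharge: number };

// Hypothetical recommender: it always has a "next" item, so the feed
// never presents a natural stopping point.
function nextItem(history: Item[]): Item {
  // Stand-in for an engagement model: each pick slightly amplifies the
  // emotional intensity of whatever held the user's attention last.
  const last = history[history.length - 1];
  const charge = Math.min(1, (last?.emotionalCharge ?? 0.5) * 1.05);
  return { id: history.length, emotionalCharge: charge };
}

// The session loop: each item autoplays into the next, and the only exit
// is one the user forces. The cap below exists purely so this demo halts;
// the design at issue has no equivalent terminal state.
function session(capForDemo: number): Item[] {
  const history: Item[] = [];
  while (history.length < capForDemo) {
    history.push(nextItem(history)); // autoplay: the next item loads unprompted
  }
  return history;
}

console.log(session(10).map((i) => i.emotionalCharge.toFixed(2)));
```

The point of the sketch is structural, not numerical: nothing inside the loop ever signals “done,” which is precisely the absence of friction the plaintiffs argue a child cannot be expected to supply on her own.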

In court filings, the California case is framed as part of a broader set of landmark proceedings that focus on how these design choices intersect with adolescent psychology, particularly for users who report depression and suicidal thoughts. One case in California zeroes in on addiction, while another emphasizes how social media may amplify body image issues and self-harm content, suggesting that the same engagement tools can have different but equally serious downstream effects on young people’s mental health.

Meta and YouTube’s defense: intent, responsibility and labels

Meta’s response is to draw a sharp line between correlation and intent. In opening statements, attorney Schmidt argued that the core question is not whether some teens struggle with mental health while using Instagram, but whether the company intentionally designed its products to cause addiction or knew that its features would do so and pressed ahead anyway. Meta denies the allegations and accuses the plaintiff’s team, led by attorney Torrez, of cherry-picking internal documents and making “sensationalist” claims that ignore the broader context of how billions of people use its services without incident.

YouTube is taking a different tack, trying to win the argument before jurors even reach the question of harm. Its lawyers say YouTube is not social media at all, but a video platform closer to television, with content that is often educational or “socially useful.” In opening arguments, attorney Li emphasized that YouTube does not have the same friend and follower dynamics as other networks and that the company has invested in tools like YouTube Kids and supervised accounts to give parents more control, a framing that aims to distance the service from the broader backlash against social apps even as it faces the same allegations in court.

The teenager at the center of the storm

Behind the sweeping rhetoric about algorithms and addiction is a specific teenager, Kaley, whose daily life is being dissected in granular detail. Lawyers say that at some points she used YouTube for “six to seven hours a day,” often late into the night, and that her Instagram use was similarly intense. They argue that this was not simply a matter of poor self-control, but the predictable outcome of systems that constantly nudged her toward more emotionally charged content and rewarded her with social validation when she complied.

During jury selection, Li acknowledged that Kaley’s usage was heavy but suggested that her struggles stemmed from preexisting vulnerabilities and offline factors rather than anything unique about the platforms. The defense is expected to highlight gaps in her medical history and to question whether any clinician ever diagnosed her with addiction or treated her specifically for social media–related harm, a strategy that aims to reframe the case as a tragic personal story rather than proof of a systemic design failure.

From one lawsuit to a systemic reckoning

This single trial is part of a much larger legal wave. Earlier this year, Meta and YouTube headed to court over harm to children after TikTok reached its own settlement, positioning these companies as the next major test of whether social platforms can be held liable for youth mental health outcomes. The California case is described as a bellwether that could influence thousands of similar claims filed by families across the country, many of whom tell strikingly similar stories about escalating screen time, anxiety and self-harm.

Judges and juries are now being asked to decide whether these harms are an unfortunate side effect of a useful technology or the foreseeable result of business models that prize engagement above all else. A slew of trials beginning this year seek to hold social media companies responsible for youth harms, and legal analysts describe them as a potential reckoning for an industry that has long argued it is merely a conduit for user speech rather than a manufacturer of risky products, a distinction that may not survive close scrutiny of how these systems are actually built and optimized.

*This article was researched with the help of AI, with human editors creating the final content.