Morning Overview

Los Angeles jury finds Meta and Google negligent in addiction case

A Los Angeles County Superior Court jury on March 25, 2026, found Meta and Google-owned YouTube negligent in how they designed and operated Instagram and YouTube, finding that both platforms were a substantial factor in harming a teenage plaintiff identified as KGM. The jury assigned greater responsibility to Meta than to YouTube and ordered YouTube to pay $900,000 in damages. The verdict marks the first time a jury has held major social media companies liable for addictive platform design targeting minors.

What the Jury Decided

The jury determined that both Meta and YouTube were negligent in the design and operation of their platforms, and that this negligence was a substantial factor in harm to KGM. Jurors concluded that Meta bears more responsibility than YouTube, a distinction reflected in the damages split. YouTube was ordered to pay $900,000 in damages, with the remainder assigned to Meta.

The case served as a bellwether, meaning its outcome could shape how thousands of similar lawsuits proceed across the country. The plaintiff, referred to publicly by the initials KGM and identified by the first name Kaley, was diagnosed with body dysmorphia, a condition that causes people to worry excessively about perceived flaws in their physical appearance. The lawsuit alleged that Instagram and YouTube knew their apps harmed children yet continued operating features designed to maximize engagement among young users.

Jurors were not asked to decide whether social media is inherently dangerous for teenagers. Instead, they focused on whether these particular platforms, as designed and operated during the years KGM used them, failed to meet a reasonable standard of care. The verdict indicates that the panel found the combination of recommendation algorithms, endless feeds, and appearance-centered content to be unreasonably risky when directed at minors, especially those already struggling with self-image.

How Meta and Google Defended Themselves

Throughout the trial, attorneys for Meta and Google challenged the core theory of the case. They disputed that social media addiction constitutes a recognized condition and argued that the plaintiff’s legal team had not established a direct causal link between platform use and KGM’s mental health struggles. Google’s defense also pushed back on how YouTube was characterized, contending that the video platform operates differently from image-driven social networks like Instagram.

Defense lawyers emphasized that teenagers today navigate a complex environment that includes school pressures, family dynamics, and offline social interactions. They argued that isolating Instagram and YouTube as the decisive cause of KGM’s distress oversimplified that reality. Meta’s attorneys highlighted tools such as time-management dashboards and parental controls, suggesting that the company had taken reasonable steps to help families manage use.

These arguments failed to persuade the jury. The distinction between YouTube and Instagram did, however, appear to influence how jurors apportioned blame. By assigning a larger share of responsibility to Meta, the panel signaled that it found Instagram’s design features, particularly those related to appearance and peer comparison, more directly tied to the harm KGM experienced. That finding carries weight for future litigation because it suggests juries may evaluate each platform’s specific mechanics rather than treating all social media as a single category.

Closing Arguments and Jury Instructions

Before deliberations began, lawyers made their final appeals to the jury, and the judge gave particular emphasis to instructions on the negligence standard jurors would apply. The plaintiff’s legal team urged the panel to weigh evidence of what the companies knew internally about the risks their products posed to young users. They pointed to internal documents and expert testimony suggesting that the platforms were designed to keep users engaged for as long as possible, even when extended use correlated with worsening mental health.

Defense counsel, by contrast, asked jurors to focus on the absence of a formal medical consensus around social media addiction and the difficulty of isolating platform use from other factors in a teenager’s life. They contended that imposing liability based on emerging science could chill innovation and unfairly punish companies for offering widely used communication tools.

That the judge stressed particular instructions before sending the jury to deliberate suggests the court recognized the novelty of applying traditional negligence law to algorithmic platform design. Negligence cases typically require proving that a defendant owed a duty of care, breached that duty, and caused measurable harm. Applying that framework to recommendation algorithms and infinite-scroll interfaces required jurors to treat software design choices as conduct that can be evaluated for reasonableness, much like a manufacturer’s decision about a physical product.

TikTok and Snap Settled Before Trial

Meta and YouTube were the only defendants remaining when the verdict was delivered. TikTok and Snap resolved their claims before the trial began, removing themselves from the courtroom but not from the broader legal picture. The terms of those settlements have not been publicly disclosed, but the timing is telling. Both companies chose to avoid the risk of a jury verdict rather than defend their platforms’ design choices under oath.

Their exit left Meta and Google to absorb the full weight of the trial’s public scrutiny. For Meta, the result compounds years of internal controversy over Instagram’s effects on teenage body image. For Google, the verdict introduces a new legal exposure. YouTube has faced regulatory pressure over children’s content before, but a negligence finding tied to addictive design is a different category of liability entirely. Even without a formal medical diagnosis of addiction, the jury’s decision suggests that design choices aimed at maximizing engagement can be framed as unsafe when they target minors.

What This Means for Pending Litigation

Because this case was structured as a bellwether, its outcome will directly influence settlement negotiations and trial strategies in the many similar suits consolidated in courts across the country. The California court system noted when the trial began that the lawsuit alleged Instagram and YouTube knew their apps harmed kids, a framing that now carries the endorsement of a jury finding.

Plaintiffs’ attorneys in other cases will likely cite the verdict’s apportionment of blame as a template. The jury’s willingness to differentiate between platforms, holding Meta to a higher standard than YouTube, gives future litigants a reason to tailor their arguments to each defendant’s specific product features rather than relying on a blanket theory of social media harm. Lawyers may focus on internal research, age-verification practices, and the prominence of features such as like counts, filters, and recommendation feeds when arguing that a platform failed in its duty of care to young users.

The verdict also sends a signal to judges overseeing similar cases. By allowing the negligence claims to reach a jury and by instructing jurors on how to apply longstanding legal concepts to algorithmic products, the court has effectively validated a roadmap for future trials. Other courts may now be more inclined to let juries hear evidence about product design, internal risk assessments, and the ways companies market their services to children and teenagers.

For the tech industry, the decision raises the prospect of substantial financial and reputational exposure. Even if appeals narrow or overturn parts of the verdict, the finding that Instagram and YouTube were negligent in their design choices will likely accelerate legislative and regulatory efforts aimed at youth safety online. Companies may face pressure to redesign feeds, limit appearance-focused metrics, or provide more robust tools for parents and young users to control what they see and how long they stay online.

For families and educators, the case underscores a growing recognition that the architecture of social media, not just the content users encounter, can shape young people’s mental health. The jury’s conclusion that design decisions were a substantial factor in KGM’s harm suggests that future debates over children’s online safety will focus less on individual responsibility and more on the obligations of the companies that build and profit from the platforms teens use every day.


*This article was researched with the help of AI, with human editors creating the final content.*