The jury in a bellwether social media addiction trial told Los Angeles Superior Court Judge Carolyn Kuhl on March 23, 2026, that it could not reach consensus, throwing the closely watched case against Meta and Google-owned YouTube into uncertainty. The disclosure came after days of deliberations at the Spring Street Courthouse in Los Angeles, where jurors had been asked to decide whether the platforms’ design features caused real harm to a young user. The impasse raises hard questions about whether American courts can resolve the growing clash between technology companies and families who say their children were damaged by algorithmically driven feeds.
What the Jury Told the Judge
Jurors sent a note to Judge Kuhl stating they were having difficulty coming to consensus on the central claims in the case. The trial is classified as a bellwether, meaning its outcome was designed to guide hundreds of similar lawsuits filed by families, attorneys general, and school districts across the country. A deadlock, if it holds, would deny both sides the precedent-setting verdict they sought and could force plaintiffs and defendants back to square one on the question of whether social media platforms can be held liable for addictive design.
The case had already narrowed before deliberations began. TikTok and Snap settled before trial, leaving Meta Platforms Inc. and Alphabet Inc.’s YouTube as the remaining defendants. Those settlements removed two major players from the courtroom but also deprived the jury of a fuller picture of the industry’s practices, concentrating the legal fight on just two companies.
A Plaintiff Who Said She Was Online “All Day Long”
At the center of the trial is a young woman identified by her initials, K.G.M., who testified that she began using YouTube and Instagram at a young age and described being on social media “all day long” as a child. Her account painted a picture of compulsive use that she said led to severe mental health consequences, including anxiety and depression. The plaintiff’s legal team argued that the platforms’ recommendation algorithms, autoplay features, and notification systems were engineered to maximize engagement in ways that exploited developing brains.
K.G.M.’s testimony was supported by her therapist, who took the stand to describe the clinical effects she observed. Local broadcast coverage by KNBC documented the therapist’s account as a significant moment in the trial, underscoring the human impact behind the legal theories. The defense, however, challenged the causal link between platform use and the plaintiff’s mental health struggles, arguing that many factors contribute to adolescent distress and that no controlled study has proven social media alone causes clinical addiction.
Instagram’s Chief Rejected the Addiction Label
One of the trial’s most contentious exchanges came when Adam Mosseri, the head of Instagram, testified that he does not believe people can get clinically addicted to social media. Mosseri drew a distinction between clinical addiction and what he called “problematic use,” a framing that sought to lower the legal stakes by suggesting that even heavy social media consumption does not meet the medical threshold for addiction as defined in psychiatric literature.
That distinction may well be at the heart of the jury’s struggle. If jurors accepted Mosseri’s framing, they would need to find that the plaintiff’s experience, however painful, did not constitute the kind of design-caused injury the lawsuit alleged. If they rejected it, they would be declaring that one of the most powerful technology executives in the world was wrong about his own product’s effects. Neither conclusion is simple, and the jury’s reported difficulty suggests the panel may be split along exactly this fault line.
Why the Deadlock Matters Beyond This Case
Bellwether trials exist to create efficiency. Courts consolidate large numbers of similar claims and try a representative case first so that both sides can gauge how juries respond to the core evidence. A clear verdict for the plaintiff would have pressured Meta and Google to settle the broader wave of litigation. A clear defense win would have weakened the hand of families and school districts pressing claims. A hung jury does neither.
If Judge Kuhl ultimately declares a mistrial, the case would need to be retried or settled, and the hundreds of related lawsuits filed by attorneys general and school districts would remain in limbo. For families waiting for accountability, the delay is not abstract. Every month without resolution is another month in which the legal system offers no answer to the question of whether platforms owe a duty of care to young users.
The technology companies, meanwhile, benefit from ambiguity. Without a binding jury finding that their designs are defective, Meta and Google face no immediate court-ordered changes to how their algorithms serve content to minors. They can continue to frame the issue as one of personal responsibility and parental oversight rather than product liability, a position that Mosseri’s testimony explicitly advanced. Company lawyers have emphasized that billions of people use their products without apparent harm and that tools already exist for parents to monitor and limit their children’s time online.
The Addiction Debate the Jury Could Not Resolve
Much of the public conversation about youth and social media treats the link between heavy use and mental health harm as settled. Inside the courtroom, that certainty evaporated. Expert witnesses parsed studies that show correlations between time spent online and anxiety or depression, but they acknowledged that correlation is not causation. Defense experts pointed to research suggesting that teens who are already struggling may simply gravitate toward online spaces, while plaintiffs’ experts argued that design choices like infinite scroll and push notifications intensify underlying vulnerabilities.
For jurors, the task was not to resolve the entire scientific debate but to decide whether, on the evidence presented, Meta and YouTube knowingly deployed features that created an unreasonable risk to children like K.G.M. The law demands a clear finding on duty, breach, causation, and damages; the science, still evolving, offers only probabilities. That gap between legal standards and emerging research may have made unanimity elusive.
The terminology itself complicated matters. “Addiction” carries cultural and legal weight, evoking comparisons to tobacco or opioids. Yet no major diagnostic manual currently lists social media addiction as a formal disorder, even as clinicians increasingly describe patients whose online behavior looks and feels compulsive. Mosseri’s insistence that what users experience is “problematic use” rather than addiction gave jurors a linguistic off-ramp: they could sympathize with K.G.M. without labeling the platforms’ design as inherently defective.
At the same time, the plaintiff’s lawyers urged jurors to look past labels and focus on design intent. Features like autoplay, algorithmic recommendations, and streak-based rewards were presented as evidence that the companies measured success in minutes and hours of attention, not user well-being. Internal documents, some shown in court, were used to suggest that executives understood the risks to young users but prioritized growth. Meta and Google disputed that characterization, saying their teams invest heavily in safety tools and that any harms are unintended side effects of products used by billions.
What Comes Next for Courts and Platforms
The immediate question is procedural: Judge Kuhl must decide whether to send jurors back for further deliberations (California state courts bar the coercive federal-style “Allen charge,” though judges may offer supplemental instructions to a struggling panel) or to declare a mistrial if continued deliberations seem futile. Either way, the case has already signaled to both sides how difficult it will be to win a clean victory in front of a lay jury asked to weigh complex science and powerful emotional testimony.
Beyond this courtroom, the deadlock may shift attention back to legislatures and regulators. If juries cannot agree on whether current law makes platforms liable for addictive design, lawmakers may face renewed pressure to clarify the standards, particularly for products used by minors. Some advocates argue for age-specific design rules, while industry groups warn that rigid regulations could stifle innovation and undermine free expression.
For now, Meta and YouTube avoid the reputational blow of a plaintiff’s verdict but also lose the certainty that a defense win might have brought. Plaintiffs’ lawyers, likewise, are spared a precedent that could have chilled hundreds of pending claims, yet they must prepare for the cost and uncertainty of trying the same issues again. The broader public is left with the same uneasy status quo: platforms that feel indispensable and, to many parents, ungovernable.
Whatever the ultimate outcome, the Los Angeles trial has already reframed the conversation. By forcing executives, clinicians, and a young woman who grew up online to testify under oath, it has turned abstract fears about “screen time” into a concrete legal dispute about duty and design. The jury’s struggle to reach consensus is itself a verdict of sorts, not on the facts of this single case, but on how hard it will be for courts to keep pace with technologies that shape childhood long before the law knows what to call them.
*This article was researched with the help of AI, with human editors creating the final content.*