Morning Overview

Big tech faces mounting lawsuits over teen social media addiction claims

A 20-year-old California woman named Kaley G.M. took the stand this week in a federal courtroom in Oakland, telling jurors that she became addicted to social media at age six and spent “all day long” on platforms like Instagram and YouTube as a child. Her testimony is a key moment in what news coverage has described as a bellwether-style trial against Meta and Google’s YouTube, two of the largest technology companies on Earth. The case sits amid a growing web of lawsuits filed by state attorneys general, families, and school districts that allege major social media platforms were designed to keep young users engaged in ways that can harm their mental health.

A Plaintiff Describes Childhood Consumed by Screens

Kaley G.M. told the court that social media use took a toll on her mental health, describing how she grew anxious and her relationships with friends and family became strained over years of compulsive use. She recounted starting on YouTube and Instagram in early childhood and said that by her preteen years she felt unable to stop scrolling, even when she wanted to sleep or focus on school. According to reporting on the trial, she described an escalating pattern in which she checked her feeds first thing in the morning and last thing at night, tying her self-worth to likes and comments and feeling panic when she was separated from her phone.

In testimony previewed before she took the stand, Kaley said she could not put down her cellphone without feeling mounting anxiety, a pattern she and her lawyers argue matches the dynamics of behavioral addiction. Her account is central to the plaintiffs’ theory that the platforms were not merely passive hosts for content but were designed to keep children engaged for as long as possible. Features like endless feeds, algorithmic recommendations, and push notifications are presented to the jury as design choices that, plaintiffs argue, exploited vulnerabilities of young users who lacked the maturity to recognize or resist those hooks.

A Bellwether Trial With Industry-Wide Stakes

The Oakland proceeding is being treated as a bellwether case, meaning its outcome could influence how many similar claims are handled across the country. Families, school districts, and local governments have filed suits alleging that social media companies knowingly created products that harmed children’s mental health, and the current trial is the first to test those allegations before a jury. AP has reported that some other companies named in related cases have reached settlements, but Meta and Google’s YouTube have chosen to fight the accusations in court rather than negotiate a broader resolution.

That decision has put this case under intense scrutiny from parents, regulators, and the tech industry itself. A verdict in favor of the plaintiffs could encourage additional lawsuits and strengthen the negotiating position of states and school systems seeking damages or reforms. A defense win, by contrast, might discourage some future claims and bolster the companies’ argument that social media risks are too diffuse and individualized to ground liability. Either way, legal observers see the trial as an early test of whether juries are willing to connect platform design decisions to real-world harm experienced by teenagers and young adults.

Tech Giants Contest the Science of Addiction

Meta and Google have mounted an aggressive defense that challenges both the factual narrative and the underlying science. Attorneys for the companies have pushed back on addiction claims, emphasizing that “social media addiction” is not a standalone diagnosis in the Diagnostic and Statistical Manual of Mental Disorders, psychiatry’s primary reference manual. They argue that while problematic use exists, it is not comparable to substance addiction in either diagnostic criteria or biological mechanisms, and they contend that the plaintiffs’ experts are stretching emerging research beyond what the data can support.

Defense lawyers have also highlighted alternative explanations for rising rates of teen depression and anxiety, pointing to academic pressure, economic insecurity, family conflict, and pre-existing mental health conditions. In their telling, platforms like Instagram and YouTube are tools that can be used in healthy or unhealthy ways, much like television or video games in earlier generations. The companies stress that they offer controls to limit screen time, tailor content, and restrict notifications, portraying these options as evidence that they encourage responsible use rather than compulsive engagement. The interpretation of internal documents produced in discovery is sharply contested: plaintiffs claim the documents show the companies knew of harms to young users, while the companies insist they reflect ordinary risk assessment in a complex business environment.

State Attorneys General Build a Multifront Legal Campaign

The federal trial unfolds against a backdrop of mounting public enforcement actions led by state attorneys general. California Attorney General Rob Bonta has filed a lawsuit in federal court alleging that Meta deployed addictive design features, misrepresented the safety of its products to the public, and violated children’s privacy laws by collecting data from users under 13. That filing describes infinite scroll feeds, autoplay video, and frequent notifications as engagement-maximizing tools that exploit developmental vulnerabilities in children and teens, who are more susceptible to peer comparison and reward cues.

New York Attorney General Letitia James joined the same coalition, with her office explaining that the states seek injunctive relief, civil penalties, and restitution for what they describe as harms to young people’s mental health and academic performance. The multistate initiative is designed to pressure Meta into structural changes, not merely monetary payouts, by targeting the business incentives that favor time-on-platform over safety. Separately, Massachusetts Attorney General Andrea Campbell has brought a state-court case accusing Meta of unfair and deceptive practices centered on fear-of-missing-out mechanics and social comparison features. Portions of that complaint remain impounded, suggesting that some of the evidence state lawyers regard as most telling about internal decision-making is still shielded from public view while litigation proceeds.

Why Parental Controls Are Not the Full Answer

Central to the defense narrative is the claim that parents, not platforms, bear primary responsibility for managing children’s screen time. The companies point to content filters, time limits, and other safety settings as proof that they have equipped families with the tools necessary to supervise digital use. Critics, however, argue that this framing overlooks the structural power of design choices such as algorithmic recommendation engines, intermittent reward schedules in notification systems, and feeds engineered to remove natural stopping points. In their view, asking parents to counter those forces with simple toggles is akin to asking someone to neutralize a casino’s slot machines by occasionally glancing at a clock.

State lawsuits explicitly describe this mismatch as a core deception: platforms publicly promoted safety features while internally prioritizing engagement metrics that undercut those tools. Public data portals such as California’s OpenJustice site have begun to aggregate information on youth mental health, school climate, and online safety, giving policymakers a clearer picture of the broader context in which these products operate. If courts ultimately decide that platforms must redesign products to align with minors’ developmental needs rather than maximizing attention, parental controls may come to be seen as a supplement to, rather than a substitute for, systemic safeguards embedded in the technology itself.

What the Outcome Could Mean for the Future of Social Media

The verdict in Kaley G.M.’s case will not, by itself, settle the national debate over youth and social media, but it could set important guideposts. A ruling that finds Meta and YouTube liable for design-related harms to a single young user would signal that juries are willing to treat recommendation algorithms, reward systems, and interface choices as potential defects, not just neutral features. That, in turn, could embolden regulators and lawmakers to press for age-appropriate design codes, default time limits for minors, and stricter data-collection rules, building on the theories advanced in the multistate attorney general suits.

If the companies prevail, they are likely to argue that the outcome validates their emphasis on parental responsibility and user choice, reinforcing a narrative that social media risks are too individualized to ground broad legal accountability. Even then, the testimony in Oakland and the parallel state actions have already pushed questions of youth safety and platform design into the center of public discussion. Whatever the jury decides, the trial underscores that the era of unexamined growth for social media giants is ending, replaced by a period in which courts, regulators, and families are all asking a harder question: not just what these products let children do, but what they are built to make them do.

*This article was researched with the help of AI, with human editors creating the final content.