
Los Angeles Jury Holds Meta, YouTube Liable for Teen Addiction, Awards $6M


A Los Angeles jury found Meta and Google’s YouTube negligent for designing addictive platforms that harmed a young woman’s mental health, ordering the companies to pay a combined $6 million in the first U.S. jury verdict to hold major social media firms liable for “addiction” harms to youth (New York Times). The decision on Wednesday assigned 70% of the responsibility to Meta and 30% to Google, and is expected to shape thousands of similar lawsuits pending across the country (New York Times).

The plaintiff, identified as 20‑year‑old Kaley G.M., testified that she began using YouTube at age 6 and Instagram at 11, and that she developed compulsive use, eating disorders, anxiety and self‑harm that she and her lawyers linked to platform design choices such as infinite scroll, autoplay, algorithmic recommendations and beauty filters (NPR). After a roughly seven‑week trial and more than 40 hours of deliberations, jurors awarded $3 million in compensatory damages and $3 million in punitive damages, concluding that the companies knew or should have known their products posed particular risks to children and failed to warn families (NPR).

How a Design-Focused Case Broke New Legal Ground

The Los Angeles Superior Court case marked a strategic shift in how plaintiffs are trying to hold tech platforms accountable, framing Instagram and YouTube not as neutral hosts of user content but as defective products whose design features were engineered to maximize engagement among children and teens (Los Angeles Times). By centering algorithms, notifications and recommendation systems rather than specific posts, lawyers sought to sidestep Section 230, the federal law that has long shielded platforms from liability over user‑generated content (NPR).

Jurors saw internal Meta documents, including a line reading, “If we wanna win big with teens, we must bring them in as tweens,” which plaintiffs argued showed deliberate targeting of very young users despite known mental‑health risks (New York Times). Executives including Meta CEO Mark Zuckerberg, Instagram chief Adam Mosseri and YouTube engineering vice president Cristos Goodrow were called to testify about growth strategies and safety controls, reinforcing comparisons some advocates have drawn to the tobacco litigation that exposed corporate knowledge of addictive products decades ago (Los Angeles Times).

Industry Backlash and the High-Stakes Appeal to Come

Meta and Google immediately said they would appeal, signaling a lengthy battle over whether product‑design theories can overcome established protections for online speech and innovation (New York Times). “Teen mental health is profoundly complex and cannot be linked to a single app,” a Meta spokesperson said, insisting the company has invested in safety tools and parental controls (NPR). Google spokesperson José Castañeda argued the case “misunderstands YouTube,” describing it as a responsibly built streaming platform rather than a social network (NPR).

Legal scholars warned the verdict could be narrowed or overturned in higher courts, pointing to unresolved questions about how negligence and product‑liability doctrines apply to recommendation algorithms and attention‑driven interfaces (Reuters). Yet plaintiffs’ lawyers said “accountability has arrived” for an industry facing roughly 1,500–2,000 related cases from families, school districts and state attorneys general, and Wednesday’s decision came just a day after a New Mexico jury ordered Meta to pay $375 million in a separate child‑safety case (Washington Post).

The Bigger Picture

While the $6 million judgment is financially modest for two of the world’s richest tech firms, the back‑to‑back jury losses increased pressure on platforms to rethink features that keep children online longer, and on lawmakers to impose design standards for youth safety (Washington Post). Even if appeals succeed, the Los Angeles verdict provided a legal and political template for future challenges to engagement‑driven business models, signaling that decisions once left to engineers and growth teams may increasingly be scrutinized by juries — and, ultimately, by the Supreme Court.