A California Jury Just Found Meta and Google Guilty of Addicting a Child. Here’s What That Means.

A Los Angeles jury delivered a verdict on March 25, 2026, that the technology industry has spent years and hundreds of millions of dollars trying to prevent: Meta and Google are legally liable for designing their platforms to addict child users. The jury awarded $3 million in damages to a 20-year-old plaintiff identified as KGM (or Kaley) after more than 40 hours of deliberation across 9 days. The jury will now separately consider whether Meta and Google owe additional punitive damages for malice or fraud.

The verdict is the first of its kind. It will not be the last.

What This Case Was Actually About

To understand why this verdict matters, start with one question most people have never considered: when a social media platform is designed to keep you scrolling, who is responsible for what happens next?

Social media addiction lawsuits against tech companies center not on the content users post but on platform design: the engineered systems platforms use to hold attention. Features like infinite scroll, push notifications, algorithmic amplification, and autoplay are not accidents of engineering. They are deliberate design decisions made by product teams with access to internal research showing these features increase usage and, in many cases, cause measurable psychological harm, particularly in children and teenagers.

The plaintiff in this case, Kaley, began using YouTube at age 6 and Instagram at age 9. By her teenage years, she was addicted to both platforms in ways her legal team argued were not incidental but engineered. Her claims against Snap and TikTok settled on undisclosed terms before the trial began. Meta and Google chose to fight and lost.

What Section 230 Is and Why It Matters Here

Section 230 of the 1996 Communications Decency Act is the legal provision that has historically protected social media companies from liability for content posted by their users. It is the reason Facebook cannot be sued for a harmful post a stranger writes, and YouTube cannot be held responsible for a dangerous video a creator uploads. For 30 years, Section 230 has been the primary legal shield protecting tech platforms from accountability for what happens on them.

Meta and Google’s defense strategy in this case leaned heavily on Section 230, arguing that any harm Kaley suffered came from user-generated content she encountered on the platforms, not from anything the companies themselves built. The jury rejected that argument.

The distinction the verdict establishes is foundational: Section 230 protects companies from liability for what users post. It does not protect companies from liability for how they designed the product that users post on. Infinite scroll is not user-generated content. Push notifications are not user-generated content. Autoplay is not user-generated content. These are engineering decisions made inside corporate product teams, and engineering decisions carry product liability.

Jose Castaneda, a Google spokesperson, told Al Jazeera: “We disagree with the verdict and plan to appeal. This case misunderstands YouTube, which is a responsibly built streaming platform, not a social media site.” Meta stated it “respectfully disagrees with the verdict” and is evaluating legal options. Both companies are expected to appeal.

What the Evidence Actually Showed

The trial produced a documented public record of what Meta and Google knew about their platforms’ effect on young users, and when they knew it. Internal company research confirming the harms of addictive design was disclosed during the trial. The plaintiff’s legal team argued that both companies deployed these features deliberately to boost usage metrics, knowing the psychological cost to younger users.

According to Pew Research Center survey data cited in the case, 36% of U.S. teenagers report using TikTok, YouTube, Instagram, Snapchat, or Facebook “almost constantly.” That figure is not a side effect of good product design. It is the product design working as intended.

Kaley’s lawyers stated after the verdict: “She showed extraordinary courage in bringing this case and telling her story in open court. A jury of Kaley’s peers heard the evidence, heard what Meta and YouTube knew and when they knew it, and held them accountable for their conduct.”

Meta argued throughout the trial that Kaley’s mental health struggles were separate from her social media use and that none of her therapists had identified social media as the direct cause. The jury did not need to find social media caused her struggles, only that it was a “substantial factor” in causing her harm. The jury found it was.

How Big This Actually Is

The March 25 verdict is one data point in what is becoming a coordinated legal campaign against social media platform design across four simultaneous fronts:

More than 2,000 plaintiffs in pending cases accuse Meta, Snapchat, TikTok, and Alphabet of knowingly designing addictive products that exposed children to predators, exploitation, and self-harm. Plaintiffs include individual teenagers, school districts, and state attorneys general.

A federal social media addiction case involving several states and school districts is scheduled to go to trial in Oakland, California, in June 2026.

A second Los Angeles state trial involving Instagram, YouTube, TikTok, and Snapchat is scheduled for July 2026.

The day before the California verdict, a New Mexico jury found Meta violated state law, holding the company liable for misleading users about the safety of Facebook, Instagram, and WhatsApp and for enabling child sexual exploitation on its platforms.

Case | Jurisdiction | Status | Parties
KGM v. Meta & Google | Los Angeles, CA | Verdict: liable, $3M damages | Meta, Google
State AG federal case | Oakland, CA | Trial begins June 2026 | Meta, Snapchat, TikTok, Alphabet
Second LA state trial | Los Angeles, CA | Trial begins July 2026 | Instagram, YouTube, TikTok, Snapchat
New Mexico AG case | New Mexico | Verdict: Meta liable | Meta

Entertainment lawyer Tre Lovell told Al Jazeera the verdict signals a structural shift: “The fact that the jury found Meta and Google liable represents that these cases have real exposure to the social media giants, and are going to frame how future litigation will proceed. Although this case will certainly be appealed, I would not be surprised if Meta and Google are already making changes within their platform to reflect the real exposure.”

What Happens to Platform Design Now

The verdict’s most consequential long-term implication is not the $3 million in damages, a sum that is financially negligible for companies worth trillions of dollars. It is the design accountability framework the verdict establishes.

If platform design features can generate legal liability, every product decision Meta and Google make about notifications, scroll behavior, autoplay, and algorithmic amplification now carries legal exposure. At least 20 U.S. states enacted social media laws in 2025 regulating children’s usage and imposing age verification requirements.

The legislative response is not limited to the United States. Greece formally announced a social media ban for all children under 15, taking effect in January 2027 and backed by DSA fine structures reaching 6% of global annual turnover. Greece is also calling on the European Commission to establish a unified Digital Age of Majority across all EU member states by the end of 2026.

NetChoice, the tech industry trade association backed by Meta and Google, is actively challenging these laws in court. The California verdict gives those state-level legal efforts significantly stronger precedent to build on.

Government pressure on Meta is escalating on multiple fronts simultaneously. The Philippines issued Meta CEO Mark Zuckerberg a direct 7-day ultimatum to suppress disinformation or face criminal prosecution under national cybercrime law, a demand backed by the same pattern of argument the California verdict just validated: that platform inaction in the face of documented harm carries legal consequences.

The $3 million awarded to Kaley covers pain and suffering. The jury has not yet decided whether punitive damages for malice or fraud apply. Punitive damages, designed to punish deliberate harmful conduct rather than compensate victims, could produce a materially larger final figure.

Meta’s stock rose 0.7% on the day of the verdict, partially cushioned by Mark Zuckerberg’s simultaneous appointment to a White House advisory council. Alphabet fell 1%. Neither movement reflects the actual stakes of what was decided on March 25, 2026.

A child started using YouTube at age 6. A jury of her peers decided the company that built the product she used bore legal responsibility for what that product did to her. That decision is now part of the legal record, and 2,000 more plaintiffs are waiting to use it.

Tech accountability, platform regulation, and the legal battles reshaping how social media companies operate are covered at The IT Horizon. Subscribe to our newsletter. We track every verdict, piece of legislation, and corporate decision that affects how technology is built and who is responsible when it causes harm.

