In a historic legal shift, juries in Los Angeles and New Mexico have ordered Meta and Alphabet's Google to pay hundreds of millions in damages. The verdicts find the tech giants negligent in platform design, specifically citing 'toxic algorithms' that led to depression and child exploitation. These rulings represent the first successful bypass of Section 230 immunity by focusing on product architecture rather than hosted content.

Landmark Damages Awarded

A New Mexico jury ordered Meta to pay $375 million for misleading users on safety, while a Los Angeles jury awarded $6 million to a plaintiff who said she had been addicted to the platforms since the age of six.

Section 230 Shield Pierced

By targeting 'infinite scrolling' and addictive design features as product defects, plaintiffs successfully circumvented traditional federal legal protections for tech firms.

International Legislative Ripple Effect

Italian senators have proposed a bill to ban 'algorithmic addiction,' while French experts call for global health-based regulations for social networks.

Two U.S. juries delivered landmark verdicts against Meta Platforms and Alphabet's Google within 24 hours of each other, finding the tech giants liable for harm to minors caused by the design of their social media platforms. A Los Angeles jury on Wednesday ordered Meta and Google to pay a combined $6 million to Kaley G.M., a 20-year-old Californian who said she developed depression, anxiety, and suicidal thoughts after becoming addicted to Instagram and YouTube from the age of six. Of that sum, $3 million covers compensatory damages and $3 million covers punitive damages, with Meta bearing 70% of the total liability. A separate New Mexico jury on Tuesday ordered Meta alone to pay $375 million after finding the company misled users about platform safety and enabled child sexual exploitation on Facebook and Instagram. Both companies denied the claims and announced plans to appeal. TikTok parent ByteDance and Snapchat parent Snap Inc. had settled with Kaley G.M. out of court before the Los Angeles trial began.

Section 230 shield cracked by product-design argument The verdicts are legally significant because they navigate around Section 230 of the Communications Decency Act, the 1996 federal law that has historically protected online platforms from lawsuits over user-generated content. Plaintiffs in both cases argued that the harm stemmed not from content itself but from deliberate design choices — including infinite scrolling, algorithmic recommendations, autoplay video, and notification systems engineered to maximize engagement. In both cases, Meta and Google sought pre-trial dismissal by invoking Section 230 immunity, and in both cases judges rejected that argument and allowed the cases to proceed to juries. Gregory Dickinson, an assistant professor at the University of Nebraska College of Law who studies the intersection of technology and law, said courts are increasingly distinguishing between claims about platform functionality and claims that would impose liability for third-party speech, according to Reuters. No appellate court has yet ruled on whether design choices are protected by Section 230, meaning the expected appeals from Meta and Google could produce binding precedent that shapes thousands of pending cases. More than 2,400 similar lawsuits have been centralized before a single federal judge in California, with thousands more consolidated in California state court.

Section 230 was enacted as part of the Communications Decency Act of 1996 and has served as the foundational legal protection for internet platforms in the United States for three decades. The Los Angeles trial was designed as a bellwether case — a test proceeding whose outcome is used by judges and attorneys to assess the potential value of remaining claims and guide settlement negotiations. The legal strategy employed by plaintiffs draws on the approach used against the tobacco industry in the 1990s, when manufacturers were accused of concealing the addictive and harmful nature of their products, a campaign that ultimately transformed public perception of cigarettes and forced industry-wide changes.

Families celebrate outside court as Meta blames difficult childhoods Outside the Los Angeles courthouse, families of children harmed by social media platforms reacted with relief and emotion to the verdict. Lori Schott, a Colorado farmer who traveled more than 1,800 kilometers to attend the ruling, told AFP the decision confirmed "that our children were being harmed" and would make the world safer. Schott lost her 18-year-old daughter Annalee to suicide; she said her daughter left a note explaining she felt ugly after constantly comparing herself to filtered images of women on social media. During the trial, lawyers for Meta and Google argued that Kaley G.M.'s mental health problems were more likely rooted in difficult family circumstances — including a neglectful father, a troubled mother, and a sister who had attempted suicide — rather than platform use. Schott rejected that framing directly, comparing the defense strategy to the behavior of a predator attacking a victim. Kaley G.M.'s lawyers said social media companies had for years profited by targeting minors while concealing the design features that made their platforms dangerous. The Los Angeles jury was composed of seven women and five men and, according to Il Giornale, required pressure from the presiding judge to reach a verdict and avoid a costly retrial.

Damages awarded — Los Angeles: $6 million (Meta and Google combined); New Mexico: $375 million (Meta alone).

Italian senators propose bill banning algorithmic addiction The U.S. verdicts prompted a legislative response in Italy, where Partito Democratico senators Antonio Nicita and Lorenzo Basso introduced a bill in the Italian Senate aimed at banning what they termed "algorithmic addiction" and "algorithmic influence." The proposed legislation would add those practices to the list of prohibited conduct when specific conditions are met, and would strengthen the accountability of leaders of major digital platforms and artificial intelligence systems within the existing European regulatory framework. Nicita and Basso said in a statement that algorithmic design and platform interfaces "are not neutral" and can induce forms of addiction with "concrete and measurable effects on people, especially minors." Psychoanalyst Michaël Stora, cofounder of the Observatory of Digital Worlds in the Human Sciences, told franceinfo on Thursday that the verdict was "rather interesting" but that society needed to go much further, calling for legislation to prevent social networks from harming adolescent mental health. Stora also called for social media addiction to be formally recognized by the World Health Organization, noting that the only form of digital addiction the WHO currently recognizes is addiction to online video games. "The big tech companies capitalize on our vulnerabilities, our weaknesses." — Michaël Stora, via franceinfo

Mentioned People

  • Antonio Nicita — Senator of the Italian Republic for the Partito Democratico since 13 October 2022
  • Raúl Torrez — Attorney General of New Mexico and member of the Democratic Party
  • Lorenzo Basso — Italian politician born in 1976
  • Michaël Stora — Psychoanalyst and cofounder of the Observatory of Digital Worlds in the Human Sciences

Sources: 65 articles