
In 2000, a lawsuit filed on behalf of six Florida cigarette smokers against tobacco companies was the first major case to establish that smoking was addictive. The $145 billion jury award did not survive the appeals process, but the jury’s findings, that cigarettes were an addictive and harmful product, were used in a host of other suits against makers of tobacco products.
A verdict last week in California, in which a jury found that Meta and YouTube’s digital platforms had addictive features that contributed to a young woman’s mental health problems, could have similar significance in litigation against social media companies. After nearly 40 hours of deliberation, the 12-person jury ruled in the plaintiff’s favor, ordering Meta to pay $4.2 million in compensatory and punitive damages and YouTube, which is owned by Google, to pay $1.8 million.
More than 1,000 civil complaints against social media platforms over alleged harms to young users have been consolidated into nine cases in California. The first to face a jury involved a 20-year-old woman referred to as K.G.M., who said she began using social media at age 8, first by creating a YouTube account and then joining Instagram a year later. She alleged that the addictive quality of the platforms—including infinite scroll and algorithms that recommend content—caused her to develop mental health disorders like depression, anxiety, and body dysmorphia.
But K.G.M.’s case and previous ones against Big Tobacco differ in two crucial ways. First, “tobacco is not a First Amendment-protected product,” Clay Calvert, a senior fellow of technology policy at the American Enterprise Institute, told The Dispatch. “When we’re dealing with speech services like social media platforms, then we start raising First Amendment questions.”
The other difference is the question of whether social media addiction is behavioral rather than substance-based. “It’s contested whether or not there really is such a thing as social media addiction,” Calvert said. “But that’s really what the plaintiffs are relying on, and their experts are saying, ‘Yes, it can be addictive, even though it’s not diagnostically recognized as a psychiatric disorder.’”
The bellwether case marks the first time that a plaintiff suing Big Tech has been able to circumvent Section 230—a legal provision that protects social media sites from being held liable for content posted by users—let alone win.
In a statement, Meta said it disagreed with the verdict and will appeal. “Teen mental health is profoundly complex and cannot be linked to a single app. We will continue to defend ourselves vigorously as every case is different, and we remain confident in our record of protecting teens online,” a company spokesperson stated.
Google also said it plans to appeal. “This case misunderstands YouTube, which is a responsibly built streaming platform, not a social media site,” Google spokesman Jose Castañeda said.
K.G.M.’s lawsuit originally included Snap Inc., the parent of Snapchat, and ByteDance, the company that developed TikTok, but those companies settled out of court for undisclosed amounts in January, just weeks before the trial began.
The verdict in California came less than 24 hours after a New Mexico jury found that Meta had violated state law by knowingly harming minors’ mental health and concealing information about the proliferation of child sexual exploitation on its platforms. The jury found that Meta had committed thousands of violations of the state’s Unfair Practices Act, ranging from making misleading statements to engaging in “unconscionable” trade practices, and assessed a penalty of $375 million. The back-to-back decisions could “profoundly affect Meta going forward in terms of [how] its platforms deliver content to minors,” said Calvert.
“Three hundred seventy-five million dollars is a ton of money, even if it’s relatively a drop in the bucket for Meta, because you have more than 30 state attorneys general filing such cases,” Calvert said.
The First Amendment question.
What makes the California case significant is the novel argument the plaintiff’s attorneys used to sidestep First Amendment concerns that have doomed previous lawsuits against Big Tech companies. They argued that the design features of Meta and YouTube’s platforms were the problem, rather than the content being disseminated.
“I’m going to show you evidence that these companies built machines designed to addict the brains of children,” attorney Mark Lanier said in his opening statement. “And they did it on purpose.”
This argument drew a distinction between how social media platforms are engineered and the statute that has previously shielded them from lawsuits. Section 230, enacted as part of the Communications Decency Act within the Telecommunications Act of 1996, protects an internet platform from being treated “as the publisher or speaker of any information provided by another information content provider.” In other words, if someone posts libelous or otherwise illegal content on Instagram, Meta won’t be held responsible.
“The interpretation of [Section 230] for the last 30 years has been to provide tech companies, in particular social media platforms, with really extensive legal protections from being sued,” said Mary Anne Franks, a professor of technology law at George Washington University.
That has led to criticism from both lawmakers and legal analysts, who argue that the law—part of a push to address the possible risks associated with the internet while allowing tech innovation—has allowed tech companies to operate with little accountability.
While the Los Angeles jury’s decision does not necessarily change the way the courts will interpret Section 230 going forward, it offers a roadmap for future litigation.
Addictive by design?
The question for the jury was whether it was the content itself that caused K.G.M.’s mental health problems—which would have brought Section 230 into play—or whether specific design features of the platforms, such as infinite scroll and beauty filters, played a role. Last year, Meta removed all filters that alter a user’s appearance in uploaded images, including those that make a user’s nose appear smaller or lighten skin color.
The plaintiffs presented a trove of documentation from Meta and Google to prove their point that the companies sought to capture and hold the attention of young users.
“If we want to win big with teens, we must bring them in as tweens,” one 2019 Meta memo displayed in court read. Another internal message from an Instagram employee stated that the company was “basically pushers” that caused “reward deficit disorder, because people are binging on Instagram so much they can’t feel the reward.”
Meta argued that K.G.M.’s mental health problems arose from issues outside of its platform, citing medical records and testimony from therapists that showed that K.G.M.’s mother had been verbally and physically abusive toward her. Meta’s lawyers also cited a medical report showing that K.G.M. had previously been traumatized after witnessing her sister’s suicide attempt.
Testifying at the trial, Meta founder and CEO Mark Zuckerberg said he regretted not having taken more decisive action previously to identify users under the age of 13, but ultimately felt the company reached the “right place over time” regarding in-app protections for young users.
Impact on Big Tech.
Now, ahead of similar legal cases expected to go to trial in the coming months, companies like Meta and Google may alter their products’ design to avoid risking further legal action, including by changing their algorithms and removing features such as endless scroll. Additionally, Calvert said the recent case and similar ones could “embolden lawmakers at the federal level and at the state level to adopt more laws to protect minors from social media platforms,” which would offer a further incentive for Meta and Google to modify their platforms.
“The bottom-line impact of the K.G.M. verdict, especially combined with the New Mexico verdict, is that the tech industry is no longer untouchable,” said Franks. “More significant than the specific theory of liability or the dollar amount of damages is the fact that in two of the first cases to bring the internal communications and calculations of major tech platforms to light, juries concluded that they revealed deliberate choices to cause harm.”
Another impact, though, may be felt by smaller platforms trying to enter the social media market. Such platforms will have to factor design choices and potential exposure to future litigation into their plans, said Jennifer Huddleston, a senior fellow of technology policy at the Cato Institute.
“How might this impact smaller platforms who might be trying to provide alternatives, trying to provide safer services or not appealing to a teenage demographic?” she said. “What might that mean for a small platform who’s trying to get into this market and perhaps seeking funding?”