Thursday, 26 March 2026

From Attention Economy to Legal Accountability:

Social Media Platforms and the Erosion of Digital Immunity


Farid Novin, Ph.D.



Abstract

In March 2026, two landmark jury verdicts delivered within twenty-four hours of each other fundamentally altered the legal landscape governing social media platforms in the United States. On 24 March 2026, a New Mexico jury ordered Meta Platforms to pay $375 million in civil penalties for violating state consumer protection law by misleading users about platform safety and enabling child sexual exploitation—the first time a U.S. state prevailed at trial against a major technology company on child safety grounds. The following day, a Los Angeles jury found Meta and Google's YouTube negligent in the design of their platforms, awarding $6 million in combined compensatory and punitive damages to a plaintiff who alleged that addictive design features had severely harmed her mental health during childhood. This article situates both verdicts within the broader trajectory of digital governance, analysing the legal architecture that enabled them—principally the erosion of Section 230 immunity through a conduct-versus-content distinction—and examines the socio-economic, regulatory, and public health implications that follow. Drawing on the tobacco litigation analogy that has come to frame public discourse on this issue, the article argues that while imperfect, the comparison captures an important structural dynamic: a phase transition in which decades of quasi-immunity gives way to escalating legal and regulatory accountability. The article further surveys concurrent legislative developments across Australia, Europe, and North American state legislatures, concluding that the March 2026 verdicts represent not an endpoint but the beginning of a prolonged reckoning with the design obligations of consumer-facing digital platforms.


Keywords: social media; product liability; Section 230; platform design; digital addiction; youth mental health; Big Tobacco analogy; KGM v. Meta; New Mexico v. Meta



1. INTRODUCTION: THE SOCIAL MEDIA ATTENTION ECONOMY AND THE ACCUMULATION OF HARM


Since the early 2000s, social media platforms have evolved from niche communication tools into foundational infrastructure for social, economic, and political life. Platforms including Facebook, Instagram, YouTube, and TikTok restructured the mechanics of human interaction by embedding them within an attention economy—a commercial model in which user engagement is monetised through targeted advertising and the continuous extraction of behavioural data. The longer users remain on a platform, the more data is generated and the more advertising revenue is produced. User retention therefore became the core engineering objective, shaping design decisions from the architecture of content feeds to the calibration of notification systems.


For most of their existence, these platforms operated within a legal environment that afforded them extraordinary protection from civil liability. Section 230 of the U.S. Communications Decency Act of 1996 shielded online intermediaries from legal responsibility for content generated by third-party users, providing a statutory foundation upon which the modern internet was constructed. For nearly two decades, social media companies successfully invoked this protection to defeat lawsuits, and the broader regulatory environment—particularly in the United States—remained permissive.


By the mid-2010s, however, a confluence of whistleblower disclosures, academic research, and congressional scrutiny began to document the costs of this arrangement. Internal research at Meta, published by the Wall Street Journal in 2021 and subsequently entered into court records, indicated that company researchers were aware that Instagram was associated with body image distress and depression among teenage girls, yet these findings did not produce meaningful design changes. The question of whether social media platforms caused, rather than merely correlated with, mental health decline in young people remained contested in the scientific literature. But within courtrooms and legislatures, the framework for assigning legal and regulatory responsibility was slowly shifting.


The present article examines the legal, regulatory, and socio-economic consequences of two jury verdicts delivered in U.S. courts on 24 and 25 March 2026—a pair of decisions that, taken together, constitute the most significant judicial challenge to social media platform immunity in the history of American law. It analyses the legal strategy that made these verdicts possible, surveys the rapidly evolving global regulatory landscape, and interrogates the widely invoked analogy between the present litigation and the tobacco industry's legal reckoning of the 1990s. The article concludes that, whatever their ultimate outcome on appeal, these verdicts mark the beginning of a structural transformation in digital governance.



2. THE MARCH 2026 VERDICTS: A CONVERGENT JUDICIAL CHALLENGE


2.1  New Mexico v. Meta Platforms (24 March 2026)


On 24 March 2026, a jury in the First Judicial District Court of Santa Fe, New Mexico concluded a six-week trial by finding Meta Platforms liable on all counts brought by the state's attorney general, Raúl Torrez. The case, filed in 2023, alleged that Meta had violated New Mexico's Unfair Practices Act by misleading consumers about the safety of its platforms—Facebook, Instagram, and WhatsApp—and by knowingly enabling child sexual exploitation through algorithmic features and inadequate protective mechanisms.


Central to the prosecution's evidence was an undercover investigation in which the attorney general's office created fictitious social media accounts representing users under the age of 14. Those accounts received unsolicited sexually explicit material and were contacted by adults seeking similar content, generating criminal referrals against multiple individuals. Prosecutors also presented internal Meta communications suggesting that a 2019 decision by CEO Mark Zuckerberg to implement end-to-end encryption on Facebook Messenger by default was understood internally as likely to impede the reporting of millions of instances of child sexual abuse material to law enforcement.


The jury found that Meta had engaged in 'unfair and deceptive' and 'unconscionable' trade practices under state law, imposing the maximum statutory penalty of $5,000 per violation. The total award—$375 million—represented thousands of individual violations and, while substantially less than the $2.1 billion the prosecution had sought, constituted the first time a U.S. state had prevailed at trial against a major technology company over child safety claims. New Mexico Attorney General Torrez declared the verdict 'a historic victory for every child and family who has paid the price for Meta's choice to put profits over kids' safety.' Meta stated that it 'respectfully disagrees with the verdict and will appeal.'
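The implied scale of the violation count can be recovered from the award itself. Assuming each violation was assessed at the full $5,000 statutory maximum, as the jury's finding indicates, the arithmetic is:

```latex
\frac{\$375{,}000{,}000 \;\text{(total award)}}{\$5{,}000 \;\text{(maximum penalty per violation)}} = 75{,}000 \;\text{violations}
```

On this reading, the jury found on the order of 75,000 individual violations, and the total award amounts to roughly 18% of the $2.1 billion the prosecution had sought.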


The case has a second phase, beginning in May 2026, in which Judge Bryan Biedscheid will consider the state's public nuisance argument and determine whether Meta should be required to implement structural changes to its platforms, including effective age verification and the removal of encryption features that impede law enforcement reporting.


2.2  KGM v. Meta Platforms and YouTube LLC (25 March 2026)


One day later, on 25 March 2026, a jury in Los Angeles County Superior Court delivered what may prove the more consequential verdict in doctrinal terms. After more than forty-three hours of deliberation across nine days, the jury found that Meta and Google's YouTube were negligent in the design or operation of their social media platforms, that this negligence was a substantial factor in causing harm to the plaintiff, and that the companies had failed to adequately warn users of those dangers. The jury further found, in a punitive damages phase conducted immediately afterward, that Meta and YouTube had acted with malice, oppression, or fraud in harming the plaintiff.


The plaintiff, a twenty-year-old California woman identified in court filings by her initials KGM—and referred to in proceedings as Kaley—testified that she began using YouTube at the age of six and Instagram at age nine. She described spending up to sixteen hours per day on the platforms at peak usage, experiencing an emotional 'rush' from likes and notifications, suffering from depression, body dysmorphia, and suicidal ideation, and continuing to feel compelled to use the platforms compulsively into adulthood. The original complaint also named TikTok and Snapchat parent company Snap Inc., both of which settled before trial on undisclosed terms.


The jury awarded $3 million in compensatory damages, apportioning 70% to Meta and 30% to Google. In the punitive phase, it recommended an additional $2.1 million against Meta and $900,000 against Google—figures subject to final judicial determination. The plaintiff's attorney, Mark Lanier, described the verdict as historically significant: 'A jury of Kaley's peers heard the evidence, heard what Meta and YouTube knew and when they knew it, and held them accountable for their conduct.'
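For clarity, the award decomposes as follows (the punitive figures remain recommendations subject to final judicial determination):

```latex
\begin{aligned}
\text{Compensatory:}\quad & \$3.0\text{M} \times 70\% = \$2.1\text{M (Meta)},\qquad \$3.0\text{M} \times 30\% = \$0.9\text{M (Google)}\\
\text{Punitive:}\quad & \$2.1\text{M (Meta)} + \$0.9\text{M (Google)} = \$3.0\text{M}\\
\text{Total:}\quad & \$3.0\text{M} + \$3.0\text{M} = \$6.0\text{M}
\end{aligned}
```

This reconciles with the $6 million combined figure reported for the verdict, with punitive damages mirroring the 70/30 compensatory apportionment.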


Both companies announced plans to appeal. A Meta spokesperson stated that the company 'respectfully disagrees with the verdict' and that 'teen mental health is profoundly complex and cannot be linked to a single app.' Google's spokesperson characterised the verdict as one that 'misunderstands YouTube, which is a responsibly built streaming platform, not a social media site'—a characterisation that itself reflects the companies' continued resistance to the product liability framing adopted by plaintiffs.



3. THE LEGAL ARCHITECTURE OF PLATFORM LIABILITY: DISSOLVING THE SECTION 230 SHIELD


For nearly three decades, Section 230 of the Communications Decency Act functioned as the legal cornerstone of the commercial internet. Its operative provision—that 'no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider'—enabled platforms to host vast quantities of user-generated content without exposure to the publisher liability faced by traditional media. Courts interpreted the provision broadly, consistently dismissing suits premised on the harmful character of third-party content.


The legal innovation that made the March 2026 verdicts possible was the systematic reframing of platform harm as a product design question rather than a content question. Plaintiffs' counsel in the KGM case, and in the parallel multidistrict litigation (MDL No. 3047, In Re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, pending before Judge Yvonne Gonzalez Rogers in the Northern District of California), argued that features such as infinite scrolling, algorithmic autoplay, notification systems calibrated to maximise anxiety-driven re-engagement, and variable-reward mechanics analogous to slot machines were not neutral technological choices but deliberate product design decisions carrying independent safety obligations.


This framing was validated in a series of significant pre-trial rulings. In November 2023, Judge Gonzalez Rogers dismissed content-based claims under Section 230 but allowed design defect and failure-to-warn claims to proceed. In January 2025, a California judge rejected the social media companies' bid to dismiss failure-to-warn claims, ruling that Section 230 and the First Amendment did not shield them from liability for their own design choices. In November 2025, Judge Carolyn B. Kuhl of the Los Angeles Superior Court denied Meta's motion for summary judgment in the KGM case, distinguishing between features related to content publication—potentially protected by Section 230—and features such as notification timing, engagement loops, and the absence of meaningful parental controls, which she held could be evaluated by a jury without engaging the statute's immunity provision. As legal scholars have summarised, Kuhl's ruling established a 'conduct-versus-content distinction—treating algorithmic design choices as the company's own conduct rather than as the protected publication of third-party speech' as a viable legal theory.


In a parallel development, the U.S. Court of Appeals for the Ninth Circuit has been considering California v. Meta, Inc. (No. 24-7032), in which a coalition of state attorneys general challenges Meta's invocation of Section 230. Their October 2025 reply brief argues that the states' claims focus 'solely on Meta's design choices, not the substance of third-party content'—a characterisation consistent with the emerging judicial consensus. An appellate ruling affirming this distinction would have implications extending beyond social media, potentially shaping liability exposure for other online platforms accessible to minors, including gaming services such as Roblox, which faces over 130 federal lawsuits on related grounds.


Legal experts have noted that the two March 2026 verdicts will now almost certainly become the vehicle for a definitive appellate ruling on the scope of Section 230. Gregory Dickinson of the University of Nebraska College of Law has observed that 'courts are increasingly trying to distinguish claims about platform functionality or platform conduct from claims that would really just impose liability for third-party speech.' Meetali Jain, Director of the Tech Justice Law Project, has suggested that the U.S. Supreme Court—which in 2023 declined to reach the merits of a Section 230 challenge involving YouTube—may now be more receptive to resolving the question.



4. ENGINEERED COMPULSION: THE SCIENCE AND EVIDENCE OF ADDICTIVE DESIGN


A central factual dispute in the KGM trial concerned the degree to which Meta and Google designed their platforms to foster compulsive use and whether they possessed contemporaneous knowledge of the risks this posed to young users. The evidentiary record introduced at trial illuminates both questions.


Internal Meta documents presented to the jury included a memorandum stating, 'If we wanna win big with teens, we must bring them in as tweens,' and data indicating that eleven-year-olds were four times more likely to return to Instagram than to competing platforms, despite the service nominally requiring users to be at least thirteen years old. The jury also heard testimony regarding Meta's decision to deploy beauty filters that manipulate users' facial appearances despite concerns raised by eighteen internal experts and external consultants about their potential to cause body image harm. Zuckerberg testified that the minimum age rule is difficult to enforce because of 'a meaningful number of people who lie about their age to use our services.'


Instagram head Adam Mosseri testified that he considers social media use capable of being 'problematic' but not 'clinically addictive'—a distinction with significant legal implications given that proof of clinical addiction would likely strengthen plaintiffs' causal arguments. YouTube's Vice President of Engineering, Cristos Goodrow, testified that his own children use YouTube for several hours daily and that he considers this 'good' for them, while simultaneously maintaining that the platform was 'not designed to maximize time.'


The legal characterisation of these design features as analogous to gambling mechanics—exploiting variable-reward psychology, removing natural stopping points, and calibrating notification timing to maximise anxiety-driven return visits—drew on a body of academic and clinical literature examining the neurological effects of social media use on adolescent brains. While the scientific question of whether social media causes, rather than correlates with, mental health harm in young people remains contested in the peer-reviewed literature, courts in both the KGM case and the federal MDL have held that this scientific uncertainty does not defeat plaintiffs' claims at the pleading stage; it is instead a question for juries to evaluate on the evidence.


The pivotal legal insight—and the one that distinguished this litigation from prior unsuccessful suits—was that proof of addiction was not required. Plaintiffs needed only to demonstrate that the platforms were negligently designed in ways that created foreseeable risks of harm to minor users, and that the companies had failed to provide adequate warnings. The jury in the KGM case found both elements established.



5. THE BIG TOBACCO ANALOGY: STRUCTURAL PARALLEL AND CONCEPTUAL LIMITS


The comparison between contemporary social media litigation and the tobacco industry's legal reckoning of the 1990s has become so prevalent in public discourse that it has been invoked by attorneys general, legal scholars, advocacy groups, and journalists covering both the KGM and New Mexico trials. The analogy is analytically useful but requires careful qualification.


The structural parallels are significant. Both industries generated extraordinary profits from products whose internal researchers had documented as potentially harmful to users, particularly younger ones. Both industries engaged in sustained efforts to contest, minimise, and delay public and regulatory recognition of those harms. Both relied on marketing and design strategies that were specifically oriented toward attracting and retaining younger users—practices documented in internal communications that were eventually surfaced through litigation. And both ultimately faced a litigation wave that proved impossible to contain through the legal immunities and scientific uncertainty arguments they had previously deployed successfully.


Peter Ormerod, an associate professor of law at Villanova University, has characterised the KGM verdict as 'a momentous development' while cautioning that it represents 'one step in a much longer saga.' He notes that a transformation comparable to the tobacco Master Settlement Agreement of 1998—which restructured industry practices, established a multi-decade compensation fund, and imposed marketing restrictions—would require repeated adverse verdicts at the bellwether stage, sustained losses on Section 230 appeals, and a corresponding collapse in the industry's appetite for further litigation. Sarah Kreps of Cornell University's Tech Policy Institute has similarly observed that 'once you have this type of verdict in one case, it just opens the floodgates for so many more.'


Yet the analogy has meaningful limits. Tobacco is a physically addictive product with well-characterised physiological mechanisms of harm; its commercial function is straightforwardly the delivery of nicotine. Social media platforms serve genuinely diverse social functions—facilitating communication, commerce, journalism, political participation, and education—and their effects vary substantially across users and contexts. A sixteen-year-old who experiences Instagram as a source of body image distress and a forty-year-old who uses it to sell handmade goods occupy very different relationships with the same product. This functional complexity complicates both the causation arguments available to plaintiffs and the remedial options available to courts and regulators.


Moreover, the tobacco litigation's ultimate resolution was facilitated by the existence of an alternative—people could simply stop smoking—whereas social media's integration into educational, occupational, and civic infrastructure means that prohibition-style remedies are neither practicable nor politically available. The regulatory imagination applicable to social media is therefore necessarily different in kind from the one that transformed tobacco: it is oriented toward design modification and disclosure rather than elimination.


Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University, has offered a more sweeping framing: 'I think the internet is on trial, not social media. If the theories work, they will be deployed elsewhere.' This observation captures an important dimension of the current litigation that the tobacco analogy risks obscuring: the legal principles at stake are not specific to any platform or industry but concern the liability exposure of the entire ecosystem of consumer-facing digital products.



6. THE BROADER LITIGATION LANDSCAPE: SCALE, PENDING CASES, AND STRATEGIC STAKES


The KGM case was structured by the Los Angeles Superior Court as a bellwether proceeding under California's Judicial Council Coordination Proceedings framework. Its outcome is designed to guide resolution of related cases consolidated in the state court system, which involve approximately 1,600 plaintiffs including more than 350 families and over 250 school districts. The case is therefore not primarily significant as an individual damages award—the $6 million total is negligible relative to Meta's annual revenues exceeding $160 billion—but as a liability template with potentially enormous aggregate consequences.


In the parallel federal litigation, MDL No. 3047 had grown to over 2,400 consolidated cases as of February 2026, including suits brought by school districts, local governments, and individual plaintiffs across the country. A federal bellwether trial involving school district plaintiffs is scheduled to commence in the Northern District of California in the summer of 2026. More than forty state attorneys general have filed suits against Meta alone, with California Attorney General Rob Bonta announcing, following the March 25 verdict, that the state 'looks forward to holding Meta accountable in our own upcoming August trial in the Bay Area.'


The financial risk profile of this litigation is transforming. Even if average per-case damages remain modest, the volume of pending cases means that aggregate liability could reach tens of billions of dollars if bellwether verdicts continue to go against the platforms. Following the two March 2026 verdicts, Meta's shares fell approximately 3% to their lowest level since May 2025, and Alphabet's shares declined approximately 1.5%. The market's response—muted relative to the symbolic severity of the verdicts—reflects uncertainty about the ultimate appellate outcome and the timeline of the litigation. As Section 230 appeals work through the courts, however, investor attention to platform liability risk is likely to intensify.



7. GLOBAL REGULATORY CONVERGENCE: LEGISLATIVE RESPONSES ACROSS JURISDICTIONS


The March 2026 verdicts did not occur in a regulatory vacuum. They coincided with—and have accelerated—a wave of legislative activity across multiple jurisdictions aimed at restricting or conditioning minors' access to social media, and at imposing design and transparency obligations on platforms.


Australia became the first country in the world to ban social media for children under the age of sixteen when the Online Safety Amendment (Social Media Minimum Age) Act 2024 came into force on 10 December 2025. The legislation applies to Facebook, Instagram, TikTok, Snapchat, YouTube, X, Reddit, Twitch, Threads, and Kick, and places compliance obligations on platforms rather than on users or parents. Companies face fines of up to AUD 50 million for failure to take reasonable steps to prevent under-sixteens from holding accounts. Within the law's first month of operation, major platforms had suspended approximately 4.7 million accounts belonging to Australian teenagers. The legislation faces legal challenges from Reddit and civil liberties organisations, with preliminary hearings scheduled in the High Court during 2026.


Australia's action has catalysed legislative movement across Europe and Asia. In late January 2026, the French National Assembly passed a bill banning social media for children under fifteen by a vote of 130 to 21; the legislation awaits Senate consideration and, if approved, is expected to take effect at the start of the 2026–2027 school year. The Danish government secured cross-party parliamentary support in November 2025 for a restriction on under-fifteens, with implementation expected by mid-2026. Germany's governing coalition has discussed a ban for under-sixteens, with an expert commission expected to report recommendations by the summer of 2026. Spain, Norway, Malaysia, Slovenia, and Portugal have each advanced comparable proposals. The European Parliament called on member states in November 2025 to prohibit social media access for children under thirteen.


Within the United States, the legislative response has proceeded at the state level. In 2025 alone, twenty states enacted new laws governing children's social media access, covering mechanisms including age verification, parental consent requirements, and restrictions on engagement-maximising features. California's SB-976 (the Protecting Our Kids from Social Media Addiction Act) prohibits platforms from delivering algorithmically addictive feeds to minors. Minnesota has enacted a law, taking effect in July 2026, requiring platforms to display mental health warning pop-ups before users can access social media services.


The convergence of litigation and legislation is not coincidental. Each reinforces the other: jury verdicts documenting platform knowledge of harm strengthen legislative arguments for precautionary regulation, while regulatory mandates create documented compliance obligations whose violation can support negligence claims in subsequent litigation. This dynamic, familiar from tobacco and opioid litigation, is now structuring the political economy of social media governance.



8. SOCIO-ECONOMIC IMPLICATIONS: PUBLIC HEALTH, INDUSTRY STRUCTURE, AND CORPORATE ACCOUNTABILITY


8.1  Public Health


The legal recognition of addictive design as a cognisable harm reframes social media from a lifestyle choice into a consumer product safety issue with public health dimensions. Governments and health authorities in several jurisdictions have begun treating excessive social media use among minors as analogous to other behavioural conditions requiring public health responses, prompting investments in digital literacy education, screen time guidance, and adolescent mental health services. The New Mexico trial provided particularly stark documentation of the connection between platform design and child sexual exploitation—a harm with undeniable public health and criminal justice dimensions that existing platform self-regulatory frameworks had failed to prevent.


8.2  The Attention Economy Under Pressure


The advertising industry is structurally dependent on social media engagement. If courts mandate the elimination or modification of the design features—infinite scrolling, autoplay, algorithmic recommendation loops—that drive user retention, this will reduce the total time users spend on platforms and, correspondingly, the advertising inventory available to sell. Estimates of the revenue impact are speculative, but the direction of effect is clear: platforms whose business models are premised on maximising engagement face a structural tension between commercial optimisation and the design safety standards that courts and legislators are increasingly imposing. Affected parties would include not only platform shareholders but digital marketing agencies, content creators, and the small businesses that have migrated their advertising expenditures from traditional media to social media over the past decade.


8.3  Corporate Accountability and Investor Risk


The two March 2026 verdicts represent the first time that U.S. juries have found major social media companies liable for design choices at trial. The precedential value of bellwether verdicts means that each additional adverse ruling will narrow the platforms' negotiating position and reduce the credibility of their resistance to settlement. The tobacco and opioid litigation waves both reached inflection points at which the cumulative cost of litigation, reputational damage, and regulatory pressure made industry-wide settlement more economically rational than continued defence. Whether and when social media litigation reaches an equivalent inflection point will depend on the outcome of Section 230 appeals, the results of additional bellwether trials, and the willingness of legislatures to supplement judicial remedies with regulatory mandates.



9. CAUSATION, COMPLEXITY, AND THE LIMITS OF ANALOGICAL REASONING


The defendants' most durable legal argument—one that survived the verdict and will animate the inevitable appeals—is the complexity of causation in mental health cases. Meta argued throughout the KGM trial that the plaintiff's challenges predated and were independent of her social media use, and that a range of factors including family environment, pre-existing conditions, and social circumstances contributed to her mental health struggles. This argument reflects a genuine scientific difficulty: randomised controlled trials of social media exposure and mental health outcomes are logistically and ethically constrained, and the existing literature consists largely of observational studies whose causal interpretation is disputed.


Courts have addressed this difficulty through doctrinal mechanisms—in particular, the 'substantial factor' causation standard applied in the KGM case—that do not require plaintiffs to establish that social media was the sole or even primary cause of their harm, only that it was a material contributing factor. But on appeal, defendants will argue that this standard was applied too permissively and that the jury was permitted to find causation on inadequate evidence. The outcome of these appeals will partly determine whether the KGM verdict functions as the first in a long series of adverse judgments or as an outlier corrected by higher courts.


The scientific uncertainty also complicates the global legislative response. A 2025 peer-reviewed analysis published in Child and Adolescent Mental Health by researchers at the University of Sydney concluded that 'it remains unclear whether social media causes poor mental health in youth, or whether the association is bi-directional or influenced by other factors' and called for 'more robust longitudinal research.' Australia's pioneering ban was enacted explicitly in advance of a settled scientific consensus, reflecting a precautionary regulatory logic that its Health Minister compared to seatbelt mandates and tobacco warnings. This regulatory posture—intervening on the basis of plausible risk before causal certainty is established—represents a significant departure from the evidentiary standards traditionally required for consumer product regulation.



10. CONCLUSION: THE BEGINNING OF ACCOUNTABILITY, NOT ITS END


The two jury verdicts delivered on 24 and 25 March 2026 are best understood not as the conclusion of a legal campaign but as its most visible milestone to date. In New Mexico, a state government for the first time held a major social media platform accountable at trial for endangering children and misleading consumers. In Los Angeles, a jury for the first time found that the architectural design of a social media platform constituted a negligently defective consumer product. Together, these verdicts validate a decade of litigation strategy, provide a legal roadmap for thousands of pending cases, and signal to platforms, insurers, and investors that the era of uncontested immunity has ended.


Whether the tobacco analogy ultimately proves apt depends on developments that remain uncertain: the outcome of Section 230 appeals, the performance of platforms in subsequent bellwether trials, the willingness of courts to accept complex causal narratives, and the capacity of global legislatures to convert public concern into durable regulatory architecture. The analogy captures the structural dynamic of the moment—a phase transition from quasi-immunity to escalating accountability—but obscures the genuine complexity of social media's role in modern life and the institutional creativity required to govern it appropriately.


What is clear is that the question confronting courts, regulators, and the platforms themselves is no longer whether social media design choices carry legal and ethical obligations, but what those obligations are and how they will be enforced. The March 2026 verdicts have ensured that this question will be answered, however imperfectly, by the legal system. The transformation of social media governance is underway.



REFERENCES


Benesch Law. (2026, January 9). The intersection of social media, AI, and product liability. https://www.beneschlaw.com/resources/the-intersection-of-social-media-ai-and-product-liability.html


Champion, K. E., Birrell, L., & Teesson, M. (2025). Debate: Social media in children and young people — time for a ban? Beyond the ban: Empowering parents and schools to keep adolescents safe on social media. Child and Adolescent Mental Health, 30(4), 411–413. https://doi.org/10.1111/camh.70032


CBS News. (2026, March 25). Meta and YouTube found liable on all charges in landmark social media addiction trial. https://www.cbsnews.com/news/meta-youtube-social-media-addiction-lawsuit-verdict/


CNN. (2026, March 24). Jury finds Meta liable in case over child sexual exploitation on its platforms. https://www.cnn.com/2026/03/24/tech/meta-new-mexico-trial-jury-deliberation


CNN. (2026, March 25). Meta and YouTube found liable on all counts in landmark social media addiction trial. https://www.cnn.com/2026/03/25/tech/social-media-addiction-trial-jury-decision


CNBC. (2026, March 24). Meta must pay $375 million for violating New Mexico law in child exploitation case, jury rules. https://www.cnbc.com/2026/03/24/jury-reaches-verdict-in-meta-child-safety-trial-in-new-mexico.html


CNBC. (2026, March 25). Jury in Los Angeles finds Meta, YouTube negligent in social media addiction trial. https://www.cnbc.com/2026/03/25/meta-youtube-los-angeles-california-verdict.html


Detroit News / Reuters. (2026, March 26). Verdicts against Meta, Google tee up fight over tech liability shield. https://eu.detroitnews.com/story/tech/2026/03/26/social-media-verdicts-section-230/89330275007/


Goldman, E. (2026, March 26). Comment, as cited in: Verdicts against Meta, Google tee up fight over tech liability shield. Detroit News / Reuters.


McGuireWoods. (2026, March). Can social media or AI be a defective product? Product Liability & Mass Tort Monitor. https://www.mcguirewoods.com/client-resources/alerts/2026/3/can-social-media-or-ai-be-a-defective-product/


NBC News. (2026, March 24). Meta ordered to pay $375 million in New Mexico trial over child exploitation, user safety claims. https://www.nbcnews.com/tech/social-media/jury-orders-meta-pay-375-million-new-mexico-lawsuit-child-sexual-explo-rcna265002


NBC News. (2026, March 25). Verdict reached in landmark social media addiction trial. https://www.nbcnews.com/tech/tech-news/verdict-reached-landmark-social-media-addiction-trial-rcna263421


New Mexico Department of Justice. (2026, March 24). New Mexico Department of Justice wins landmark verdict against Meta. https://nmdoj.gov/press-release/new-mexico-department-of-justice-wins-landmark-verdict-against-meta/


NPR. (2026, March 24). New Mexico jury says Meta harms children's mental health and safety, violating state law. https://www.npr.org/2026/03/24/g-s1-115019/new-mexico-meta-children-mental-health


NPR. (2026, March 25). Jury finds Meta and Google negligent in social media harms trial. https://www.npr.org/2026/03/25/nx-s1-5746125/meta-youtube-social-media-trial-verdict


Ormerod, P. (2026, March 25). Comment, as cited in: Los Angeles social media addiction trial: Jury finds Meta and YouTube liable. ABC7 Los Angeles. https://abc7.com/post/los-angeles-social-media-addiction-trial-jury-finds-instagram-youtube-liable-landmark-court-case/18771272/


Rossini, C. (2026, March). How the Instagram addictiveness lawsuit could reshape social media: Platform design meets product liability. The Conversation. https://theconversation.com/how-instagram-addictiveness-lawsuit-could-reshape-social-media-platform-design-meets-product-liability-277066


Source New Mexico. (2026, March 24). Santa Fe jury awards New Mexico $375M in Meta child exploitation case. https://sourcenm.com/2026/03/24/santa-fe-jury-awards-new-mexico-375m-in-meta-child-exploitation-case/


TechCrunch. (2026, March 6). These are the countries moving to ban social media for children. https://techcrunch.com/2026/03/06/social-media-ban-children-countries-list/


UC Law Review Blog. (2025, December 4). Addicted by design: Reassessing Section 230 in the new era of social media addiction litigation. University of Cincinnati Law Review. https://uclawreview.org/2025/12/04/addicted-by-design-reassessing-section-230-in-the-new-era-of-social-media-addiction-litigation/


Washington Post. (2026, March 25). Meta, YouTube found negligent in landmark social media addiction trial. https://www.washingtonpost.com/technology/2026/03/25/meta-youtube-verdict-social-media-addiction/


In re Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, MDL No. 3047, 4:22-md-03047-YGR (N.D. Cal.).


KGM v. Meta Platforms Inc. & YouTube LLC, Case No. JCCP 5255, Los Angeles County Superior Court (verdict delivered 25 March 2026).


State of New Mexico v. Meta Platforms, Inc., No. D-101-CV-2023-02638, First Judicial District Court, Santa Fe (verdict delivered 24 March 2026).


California v. Meta Platforms, Inc., No. 24-7032 (9th Cir., pending).


Communications Decency Act of 1996, 47 U.S.C. § 230.


Online Safety Amendment (Social Media Minimum Age) Act 2024 (Cth) (Australia) (entered into force 10 December 2025).

