From Likes to Lawsuits: Social Media’s Legal Challenges
As scrutiny of social media platforms continues to intensify, this article provides an update to our December 2024 insight, which examined the growing legal and regulatory challenges arising from concerns about social media’s impact on children, and the key considerations for insurers.
Since our last publication, the challenges facing social media companies have continued to grow, not only through ongoing litigation but also through the prospect that these allegations may become relevant to a wider class of defendants beyond social media companies, particularly given the rapid integration of artificial intelligence (“AI”) technologies into digital platforms. This update explores how these developments are shaping the future of social media liability, including the implications for insurers.
In our last article, we discussed how the social media platforms Meta, TikTok, Snap, and YouTube are facing claims consolidated in Multi-District Litigation No. 3047 (“MDL”) in the Northern District of California.[1] The MDL centres on allegations that these platforms contribute to a range of harmful behaviours in children, from suicidal ideation to eating disorders, through intentionally addictive design features and exposure to harmful content.
The MDL continues to grow, exceeding 2,200 cases as of January 2026. In June 2025, Judge Yvonne Gonzalez Rogers confirmed an initial bellwether pool of 11 cases, six by school districts and five by families, for the first social media addiction bellwether trials, now scheduled for summer 2026.
In parallel with the federal MDL, related claims are proceeding in a Judicial Council Coordination Proceeding (“JCCP”) before Judge Carolyn B. Kuhl in Los Angeles County Superior Court. On the eve of jury selection in the first coordinated trial in that state-court action, TikTok and Snap reached settlements in principle with the plaintiff, identified as K.G.M.[2] Those settlements are limited to K.G.M.’s claims and do not dispose of the roughly 1,000 remaining cases in the JCCP. However, they mean that Snap’s and TikTok’s CEOs will no longer be called to testify at trial, and focus will narrow to the remaining defendants, Meta and YouTube. Although the JCCP is procedurally distinct from the federal MDL, the state court proceedings may provide an early indication of how the social media defendants evaluate their litigation risk as the MDL bellwether trials approach.
Notably, in a November 2025 filing, school districts alleged that Meta suppressed internal research showing that the mental health of young users suffered from compulsive use of its social media platforms, even as some employees likened the company’s practices to those of drug pushers.[3] According to the filing, Meta terminated an internal study, dubbed Project Mercury, after early results suggested that users who stopped using Facebook and Instagram for just a week reported reduced feelings of depression, anxiety and social comparison.
Plaintiffs refer to internal communications in which a Meta employee warned that withholding negative results could draw comparisons with tobacco companies concealing evidence of harm.
Whilst Meta disputes the characterisation of both the study and its decision to end it, these allegations have intensified arguments that the company had knowledge of potential harms to children and teenagers whilst continuing to design and market products targeting young users.
Further pressure has been placed on Meta following a ruling allowing a former Meta researcher to testify in the MDL, despite Meta’s objections.[4] John Sattizahn is expected to testify that Meta directed internal researchers to alter study protocols to avoid documenting evidence of harm to minors, a revelation with potentially significant implications for both the underlying personal injury claims and related insurance coverage disputes.[5]
State-level enforcement actions have also accelerated. In Hawaii, the state filed suit against TikTok’s parent company, ByteDance, alleging that the platform deliberately designed features to addict children, in violation of the Children’s Online Privacy Protection Act (“COPPA”).[6] The complaint relies on statements from former employees who describe “coercive design tactics” akin to gambling industry methods, highlighting that the platform’s short-form video algorithm maximises user engagement at the expense of child safety. Hawaii further alleges that TikTok continued to collect personal data from underage users despite knowledge of their ages, echoing federal COPPA violations previously pursued by the Federal Trade Commission in 2019 and 2024.[7]
In Massachusetts, the state Supreme Judicial Court heard oral arguments in December 2025 regarding whether Instagram’s autoplay, ephemeral postings, and incessant notifications place Meta outside the protective scope of Section 230 of the Communications Decency Act. Justices debated whether these features constitute advertising rather than publishing, a distinction that could expose Meta to liability for encouraging minors’ engagement.[8]
As highlighted in our December 2024 article, insurers have moved swiftly to test whether claims arising from alleged youth harms linked to social media platforms engage cover under traditional liability policies.
In a 2024 Delaware state court action, Hartford Casualty Insurance Co. et al. v. Instagram LLC et al.,[9] insurers sought declaratory relief that they owe no duty to defend or indemnify Meta in the MDL. The insurers argued that the underlying claims do not allege covered “bodily injury” or “personal and advertising injury”, but instead arise from intentional design and business decisions aimed at maximising user engagement. They further argued that the MDL plaintiffs seek recovery for broad economic and societal harms, rather than damages “because of” injury to identifiable individuals.
Meta disputes that characterisation and has argued that the insurers are seeking to resolve coverage issues prematurely, before liability has been established in the underlying proceedings. In December 2024, Meta filed a parallel insurance coverage lawsuit in the Northern District of California, seeking to compel a defence from its insurers and to align its coverage action with the larger social media MDL. However, in May 2025, the California federal court determined that the insurers’ earlier-filed Delaware action took precedence and dismissed the California action in deference to the Delaware proceedings. As of December 2025, the coverage dispute remains pending in Delaware and no final outcome on the coverage issues has been reached.[10] As litigation grows, we can expect insurers to continue to challenge cover under liability policies where the claims relate to public nuisance rather than “bodily injury” or “personal and advertising injury”.
Insurers should also be aware that revenue-generating platforms with addictive features are not limited to social media companies. Online and video game developers have similarly faced scrutiny over features alleged to be addictive (e.g. in-game rewards and other tactics designed to encourage spending) and to harm young users.
Regulators are beginning to take action against these companies, and lawsuits have emerged alleging harms beyond those at issue in the MDL. For instance, further claims have been filed against Meta in the Delaware Superior Court,[11] and the Federal Trade Commission has banned the developer of Genshin Impact from selling loot boxes to under-16s without parental consent.[12]
In light of these allegations and growing regulatory scrutiny, insurers will no doubt examine whether social media and gaming companies knew of, but ignored, the dangers of platform features that were potentially exploitative, addictive and/or facilitated predatory behaviour resulting in harm to young users. If these harms were known, insurers could seek to deny cover on the basis that the injuries and losses now claimed were expected or intended, where such exclusions are available (for example, in CGL or Bermuda Form policies).
As referenced in our last article, claims linked to engagement with digital platforms can also extend beyond traditional social media. Litigation has continued to expand to AI technologies, particularly large language models.
In December 2025, a wrongful death lawsuit was filed in California against OpenAI and Microsoft alleging that ChatGPT exacerbated a mentally ill man’s paranoid delusions and contributed to him killing his 83-year-old mother before taking his own life.[13] The complaint alleges that the AI validated and reinforced dangerous beliefs without directing the user toward real-world help. This action is understood to be the first wrongful death lawsuit directly linking an AI chatbot to a homicide.[14]
We also note that on 9 December 2025, a bipartisan coalition of US state attorneys general warned major AI developers, including Microsoft, Meta, Google, Apple, and OpenAI, that outputs from AI chatbots may violate state laws and pose serious mental health risks, especially to children and vulnerable users.[15] The AGs called for independent audits and stronger safeguards against so‑called “delusional” or otherwise harmful AI outputs.
Taken together, these developments suggest that courts and regulators may begin to apply principles developed in social media design liability cases to generative AI systems. If courts hold AI companies liable for personal injuries or wrongful deaths allegedly tied to chatbot interactions, the implications could extend well beyond social media, potentially affecting developers of chatbots, productivity tools, and other AI‑driven interactive platforms.
The actions referenced above highlight an increasingly complex liability landscape. As courts examine whether algorithmic features and internal knowledge of harms give rise to liability, insurers must closely monitor evolving coverage disputes and emerging claims, and consider tailored policy provisions or endorsements to manage systemic risk.
[1] In Re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation (4:22-md-03047) (Northern District of California).
[3] Plaintiffs’ Omnibus Opposition to Defendants’ Motions for Summary Judgment, Document 2480 (filed 21 November 2025), In Re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation (4:22-md-03047-YGR).
[6] Hawaii v. ByteDance Inc., (1CCV-25-0001964) (Circuit Court of the First Circuit of Hawaii).
[7] Press Release: FTC Investigation Leads to Lawsuit Against TikTok and ByteDance for Flagrantly Violating Children’s Privacy Law (2 August 2024) – Federal Trade Commission.
[8] Commonwealth v. Meta Platforms Inc. et al., (SJC-13747) (Massachusetts Supreme Judicial Court).
[9] Hartford Casualty Insurance Co. et al. v. Instagram LLC et al., (N24C-11-010) (Delaware Superior Court).
[11] Rosalind "Ros" Dowey et al. v. Meta Platforms Inc. et al., (N25C-12-250) (Superior Court of the State of Delaware).
[12] Press Release: Genshin Impact Game Developer Banned From Selling Lootboxes To Under 16s Without Parental Consent – Federal Trade Commission.
[13] First County Bank v. Open AI Foundation et al., case number not available, (Superior Court of the State of California, County of San Francisco).