In the earliest of the US social media addiction trials, which began on 28 January, Snapchat and TikTok have opted to settle rather than have detrimental information about their products exposed to public scrutiny. Experts are comparing the spate of cases against the social media giants to the Big Tobacco trials that resulted in multi-billion-dollar payouts and changes to cigarette packaging.
TikTok agreed to settle with the plaintiff just prior to the trial beginning, according to Matthew Bergman, the founding attorney of the Social Media Victims Law Center, which is representing the plaintiff, 19-year-old “KGM”.
The platform, along with Meta’s Instagram and Google’s YouTube, was on the pointy end of claims that their products deliberately foster social media addiction in children, causing harm. Snapchat had settled a week earlier, on 20 January, for an undisclosed sum.
The trial against Meta and YouTube was scheduled to begin with jury selection on 28 January (Australian time) in the California Superior Court in Los Angeles.
The case, in which KGM claims she became addicted to the platforms from a young age because of their deliberately attention-grabbing design, is considered a test case for many more trials to come. KGM has alleged her suicidal thoughts and depression resulted directly from engagement with the apps.
Joseph VanZandt, co-lead counsel for the plaintiff, said in a statement on 28 January that TikTok remains a defendant in the other personal injury cases, and that the trial will proceed as scheduled against Meta and YouTube.
“Plaintiffs are not merely the collateral damage of Defendants’ products,” the lawsuit says. “They are the direct victims of the intentional product design choices made by each Defendant. They are the intended targets of the harmful features that pushed them into self-destructive feedback loops.”
The lawsuit also says the social media companies borrowed “heavily from the behavioural and neurobiological techniques used by slot machines and exploited by the cigarette industry”.
Product design features
It also states: “Defendants deliberately embedded in their products an array of design features aimed at maximising youth engagement to drive advertising revenue.”
According to an AP/Reuters report, a Meta spokesperson said in a statement on 27 January that the company strongly disagreed with the allegations outlined in the lawsuit and was “confident the evidence will show our longstanding commitment to supporting young people”.
José Castañeda, a Google spokesperson, said the allegations against YouTube were “simply not true”.
In a statement, he said “providing young people with a safer, healthier experience has always been core to our work”.
According to The Guardian, approximately 1,600 plaintiffs, including more than 350 families and 250 school districts, are involved in the series of proceedings, in which Meta, Snap, TikTok and YouTube are accused of harming children.
The KGM case is the first of about 22 “bellwether” trials to ascertain how juries respond and what verdicts are reached. The landmark trials have been coordinated as part of a judicial council coordination proceeding (JCCP), which will cover thousands of lawsuits.
The claimants are seeking monetary damages, and redesign of the platforms to mitigate their addictive nature and associated harms to children and teenagers. To that end, the parallels with Big Tobacco are clear. Under the 1998 Master Settlement Agreement (MSA) in the US, and as recently as 2025/2026 in Canada ($34.168 billion in AUD), major tobacco companies have settled for billions and been required to run ads revealing the harms of tobacco use.
The Justice Department filed a lawsuit in 1999, and a landmark judgment issued in August 2006 by US District Judge Gladys Kessler found Altria, R.J. Reynolds Tobacco, Lorillard, and Philip Morris USA to be in violation of civil racketeering (RICO) laws. The court ruled that the companies had systematically defrauded the American people, deceiving the public for decades about the health effects of smoking and about their marketing to children.
The unsealed documents in the social media cases are expected to reveal details of product design and the intent behind features that encourage “doomscrolling”, autoplay videos, and feed users customised content via secretive algorithms to maintain their attention.
As reported in The Guardian, Julia Duncan, an attorney with the American Association for Justice, said one unsealed document shows an Instagram employee calling the app a “drug” and another employee saying, “lol, we’re basically pushers”.
In New Mexico, jury selection began in the final week of January for trial on allegations that Meta and its social media platforms have failed to protect young users from sexual exploitation, following an undercover online investigation. In late 2023, Attorney General Raúl Torrez sued Meta and Zuckerberg, though Zuckerberg was later dropped from the suit.
Prosecutors have claimed that New Mexico is not seeking to hold Meta accountable for its content, but for product design involving complex algorithms that discover and share material that can be harmful. Under Section 230 of the Communications Decency Act, publishers are not liable for content generated by users. Prosecutors have said they uncovered internal documents in which Meta employees estimate about 100,000 children every day are subjected to sexual harassment on the company’s platforms (as reported by CNBC).
Statutory duty of care proposed in Australia
Conducted by Delia Rickard PSM, the Statutory Review of the Online Safety Act 2021, released in early 2025, recommended implementing a “singular and overarching digital duty of care” for online service providers in Australia.
“If the requirement is breached, a regulator and/or a court can impose some form of sanction …”
Dr Karen Lee is a Senior Lecturer who specialises in communications regulation. She says, “A statutory duty of care involves the imposition of a requirement in legislation that duty holders, such as social media platforms, must meet a particular standard … If the requirement is breached, a regulator and/or a court can impose some form of sanction, like an order requiring compliance, financial penalties, imprisonment etcetera.”
Lee points to examples where this is in effect.
“In Australia, statutory duties of care have been imposed in environmental and occupational health and safety legislation. For example, the NSW’s Work Health and Safety Act 2011 requires businesses to ‘ensure, so far as is reasonably practicable, the health and safety’ of their workers (s 19). They are not a new regulatory tool.”
Australian protections set out in the Online Safety Act 2021 (Cth)
Lee says, “Without knowing the detail of how the government intends to implement the duty – something it has promised to do – it is difficult to determine the precise impact. However, it marks an important addition to a regulatory regime that is largely complaints driven and turns in part on a set of ‘expectations’ and industry codes, which are not directly enforceable by the regulator. The eSafety Commissioner can ask platforms to report on how they are meeting the expectations with consequences for failure to report; or issue a ‘statement’ a platform has breached an expectation. If a code is breached, she can issue platforms with a formal warning or direct them to comply with it. However, these ‘remedies’ fall well short of what the general public would expect.”
Is similar litigation possible in Australia?
“Individuals could bring lawsuits in negligence against specific platforms,” Lee posits. “The more difficult and important question is whether they would win them. There are a heap of issues to consider – is a duty owed, was the duty breached, did the breach cause the plaintiffs harm, is the harm experienced recognised as one for which compensation may be awarded? – and any decision reached by a court would be based on the specific facts. None of the answers to these questions is easy.”
On the question of damages, much like those the plaintiffs are seeking in the US, Lee suggests a tort lawyer could provide the specifics. However, she says, “The aim of tort law is to put parties in the same position they would have been in but for the negligent act of the defendant. Money may need to be paid, but it isn’t a penalty as such. But to the extent that platforms can be held liable in tort, money and the exposure of poor business practices during litigation are certainly some ways to incentivise them to alter their behaviour.”
Additional legislative reform remains necessary, she asserts.
“Court processes are notoriously slow to deliver redress. If it comes at all, it can be awarded years after the fact. Look at the way US tobacco companies fought cancer claims for decades. There are no easy answers here, and I would caution against adopting a ‘binary’ approach. The two options can possibly be pursued simultaneously. Platform activities are so vast and wide ranging, a suite of measures are needed across a number of areas of law – not just online safety legislation. Tougher privacy, competition, and consumer protection law will also play an important role.”
Parallels between Big Tobacco and Big Tech
In the US, plaintiffs are arguing that social media apps are designed to be addictive, akin to cigarettes, and that the platforms have prioritised profits over the mental health of young people.
“… they reframed the harm from one of individual choice … to one of systematic deception where the industry knew of the harm and proceeded anyway.”
Alexandra Jones, an expert in Global Health Law, and presently the Program Lead, Food Governance, Food Policy in the Faculty of Medicine, UNSW, says, “Public health law provides examples of where individuals (and groups of individuals) have challenged corporations for harming their health. Tobacco litigation is one of the biggest examples of where this has been done effectively – cases like the Master Settlement Agreement in the US were successful because they reframed the harm from one of individual choice (e.g. a smoker choosing to smoke) to one of systematic deception where the industry knew of the harm and proceeded anyway. There were also resulting health and medical costs incurred by governments as a result of this behaviour.”
She adds, “There are signs that social media cases are drawing lessons from this – particularly in arguments that harm arises from intentional product design (e.g. infinite scroll, algorithmic amplification, notifications), in claims that focus on youth targeting, and foreseeable population-level harm. Looking at these population level impacts takes cases away from a focus on individual harm and potentially on individuals’ particular vulnerability to harm.”
Twenty-five years ago, Melbourne woman Rolah McCabe launched a landmark lawsuit against a major tobacco company. It was the first such Australian case, following a spate of successful American cases. In the Supreme Court of Victoria, McCabe brought a personal injury claim against British American Tobacco (McCabe v British American Tobacco).
In April 2002, 51-year-old McCabe was awarded $700,000 when Justice Geoff Eames found that British American Tobacco (BAT) had deliberately destroyed documents to try to sabotage McCabe’s legal action. The win was bittersweet, as McCabe died of lung cancer seven months later, and Eames’ decision was reversed on appeal the same year.
McCabe had begun smoking aged 12, and she alleged that BAT had been negligent in its marketing and manufacturing of cigarettes, ultimately causing her lung cancer. The premise was that BAT knew its product was addictive and dangerous to health; targeted children with its advertising; and knowingly misled the public about research that revealed evidence of health risks from smoking.
After the Court of Appeal overturned the judgment made in McCabe’s favour, the case returned to trial and remained in the courts until it was settled confidentially in March 2011. Subsequently, the US Department of Justice used evidence from McCabe’s case in its anti-racketeering case against the US tobacco industry.
