Meta chief executive Mark Zuckerberg faced sustained questioning in a Los Angeles courtroom on Wednesday as a landmark trial over social media harms moved into a pivotal phase, with plaintiffs arguing that platforms such as Instagram were deliberately designed to keep young users hooked despite internal warnings.
The case, being heard in Los Angeles Superior Court, is one of several major lawsuits this year that experts have likened to the industry’s “Big Tobacco” moment. At its core is a claim brought by a 20-year-old woman, known in court filings as KGM, who says her compulsive use of Instagram and YouTube worsened her depression and suicidal thoughts. Her lawsuit is one of about 20 bellwether cases being used to test how juries respond to arguments about harmful product design rather than individual content.
Zuckerberg questioned on underage Instagram users and age verification failures
A central focus of Wednesday’s testimony was whether Meta took adequate steps to keep children under 13 off Instagram, which requires users to be at least 13.
Zuckerberg said the company had improved its ability to detect underage users, but conceded it had not moved fast enough. “I always wish that we could have gotten there sooner,” he told the court.
He said some users lie about their age when joining Instagram and that Meta removes accounts it identifies as underage. Plaintiffs’ lawyers challenged the credibility of that system, arguing that the company relied too heavily on formal policies rather than enforceable barriers.
“You expect a nine-year-old to read all of the fine print? That’s your basis for swearing under oath that children under 13 are not allowed?” a lawyer asked.
After repeated questioning about age verification, Zuckerberg responded: “I don’t see why this is so complicated.”
Meta chief says company responsibility should extend to users’ wellbeing
As the plaintiffs attempted to frame Meta’s products as a public health issue rather than a consumer preference, Zuckerberg was asked about what obligations a technology company owes its users.
“I think a reasonable company should try to help the people that use its services,” he said.
The statement echoed a broader theme running through the trial: that social media companies, like manufacturers in earlier waves of litigation, should be held accountable not merely for what appears on their platforms, but for the way those platforms are built — and the behaviours they encourage.
Beauty filters: Zuckerberg says Meta consulted stakeholders but prioritised free expression
The testimony also returned to a long-running debate inside Meta over Instagram’s beauty filters, which critics say contribute to distorted self-image and anxiety among young users.
Zuckerberg said Meta consulted “various stakeholders” about the use of the filters, though he did not name them.
Plaintiffs’ lawyers questioned him about internal messages suggesting he lifted a ban on certain filters because he believed the restriction was excessive.
“It sounds like something I would say and something I feel,” Zuckerberg replied. “It feels a little overbearing.”
He was pressed on why the company allowed the feature even after receiving guidance from experts that beauty filters had negative effects, particularly on young girls.
Lawyers referenced a University of Chicago study in which 18 experts said beauty filters as a feature cause harm to teenage girls. Zuckerberg said he saw the feedback and discussed it internally, but said the decision ultimately came down to free expression.
Engagement targets and internal metrics: Zuckerberg disputes “company goals”
The trial has also scrutinised whether Meta set explicit objectives to increase time spent on Instagram, an allegation at the heart of claims that the platform was engineered for addiction.
Zuckerberg pushed back on the notion that increasing engagement was a company goal. He was questioned about a 2015 email thread in which he appeared to highlight improving engagement metrics as an urgent matter. While acknowledging the email chain may have contained the words “company goals”, Zuckerberg said the comments could have been aspirational and insisted Meta does not hold such objectives.
Plaintiffs later introduced evidence from Instagram chief Adam Mosseri that included goals to raise daily engagement time to 40 minutes in 2023 and to 46 minutes in 2026.
Zuckerberg said Meta uses internal milestones to measure itself against competitors and “deliver the results we want to see”, maintaining that the company is building services to help people connect.
Judge warns against recording testimony with AI smart glasses
Courtroom decorum became a flashpoint after Judge Carolyn B. Kuhl warned that anyone recording Zuckerberg’s testimony using AI smart glasses would be held in contempt.
“If you have done that, you must delete that, or you will be held in contempt of the court,” the judge said. “This is very serious.”
The warning came after members of Zuckerberg’s security detail were photographed wearing Meta Ray-Ban artificial intelligence glasses outside the courtroom. Recording is not permitted inside the courtroom.
Board control and media awkwardness: Zuckerberg revisits old remarks
Lawyers also questioned Zuckerberg about his prior statements suggesting the Meta board could not meaningfully remove him because of his voting power.
“If the board wants to fire me, I could elect a new board and reinstate myself,” he said, referring to remarks made on Joe Rogan’s podcast.
He also acknowledged his discomfort under public questioning, telling the courtroom he is “very bad” at media.
“I think I’m actually well-known to be sort of bad at this,” he said.
A case built on design, not content, seeks to bypass tech’s traditional legal shield
The trial marks the first time Zuckerberg has faced a jury in a civil trial over child safety concerns. For years, technology companies have leaned on federal protections that largely shield them from liability for user-posted content.
Plaintiffs in this litigation have pursued a different strategy. Their argument is not primarily about individual posts or videos, but about product design — features they say were intended to maximise engagement, reward compulsive use, and keep users scrolling.
That approach has so far allowed the cases to sidestep the industry’s most familiar legal defence.
The bellwether cases and what is at stake for the tech industry
The Los Angeles case involving KGM is one of roughly 20 bellwether lawsuits designed to gauge jury reactions before hundreds of similar claims proceed.
TikTok and Snap settled the claims against them in the initial trial but remain defendants in other cases tied to the broader litigation.
Zuckerberg’s testimony came about a week after Mosseri appeared on the stand. Mosseri pushed back on the science behind social media addiction, saying users could not be “clinically addicted”. He described children’s high usage of Instagram as “problematic use”, comparable to “watching TV for longer than you feel good about”.
While psychologists do not classify social media addiction as an official diagnosis, researchers have documented the harmful consequences of compulsive use among young people, and lawmakers globally have raised concerns about addictive design.
Meta disputes the role Instagram played in KGM’s mental health
Meta’s defence has sought to acknowledge KGM’s mental health struggles while disputing that Instagram played a significant role in exacerbating them.
Paul Schmidt, one of Meta’s attorneys, said in an earlier opening statement that the company accepted KGM’s mental health issues but argued that Instagram was not the primary driver. Schmidt cited medical records suggesting the central issue was a difficult home life.
Families and advocates say court may deliver what Congress has not
The litigation is being closely watched by families who argue that legislative action has stalled despite years of hearings and public scrutiny.
Two years ago, Zuckerberg faced similar questions during a tense congressional hearing on child exploitation. In January 2024, he turned towards grieving parents and apologised, promising continued investment to protect children.
Some families remain unconvinced.
“His apology – if you will call it that – was mostly empty,” said John DeMay, whose 17-year-old son Jordan died by suicide in 2022, hours after being targeted in an online sextortion scam on Instagram. “He basically said they’re doing everything they can to stop and prevent this stuff from happening and unfortunately that’s just not the case.”
DeMay, who has travelled frequently to Washington to advocate for online child safety, said he now has greater faith in the courts than in Congress.
“I’m hopeful that this case prevails but if it doesn’t, we still won because we showed the world – with on-the-record evidence – that they’re doing one thing and saying another,” he said.
Meta faces other lawsuits as child safety claims spread across states
Meta is also fighting separate litigation in New Mexico, where prosecutors accuse the company of violating consumer protection laws by failing to disclose what it knew about potential harms to children. Meta has denied the allegations.
Instagram has added safety features in recent years aimed at younger users, but advocacy groups argue those tools remain inconsistent.
A 2025 review by Fairplay, a non-profit focused on reducing the influence of big technology on children, concluded that of the safety features it assessed, “less than one in five are fully functional and two-thirds (64%) are either substantially ineffective or no longer exist”.
Former employees have also raised concerns about the company’s internal culture. Kelly Stonelake, a former Meta employee, said she left on medical leave in February 2023 after harassment and retaliation for raising child safety concerns. She sued Meta last year, alleging a pattern of silencing women.
She alleges Meta was collecting data on children without parental consent and exposing them to other adults and “an environment that we knew was riddled with harassment and bullying”.
Why the trial is being called social media’s “Big Tobacco” moment
The phrase has emerged as shorthand for a legal and political reckoning: an attempt to establish that social media companies, like tobacco firms in earlier decades, knew their products could harm users but failed to act decisively.
For Meta, Zuckerberg’s appearance placed the company’s internal deliberations — and the unresolved tension between free expression, safety, and commercial incentives — under a level of courtroom scrutiny rarely seen in the technology sector.
And for the wider industry, the outcome could shape not only financial liability, but also the design norms that have defined social media for more than a decade.