Key Takeaways
- Meta faces major US lawsuit with 1,800+ plaintiffs over teen safety concerns
- Internal research allegedly shows Instagram and Facebook contribute to teen anxiety and depression
- Company accused of delaying safety features to protect engagement and revenue
- Case highlights growing concerns for Indian parents as social media use surges
Meta is facing a massive US lawsuit alleging the company knowingly exposed teenagers to harm on its platforms while delaying critical safety measures. The legal action involves more than 1,800 plaintiffs, including children, parents, schools, and state authorities, who claim Meta prioritized growth over child safety.
Internal company research cited in court filings suggests Instagram and Facebook contribute to increased anxiety, depression, and exposure to sexual abuse content among teens. Despite these known risks, Meta allegedly delayed safety features because they could have reduced user engagement and profits.
Major Allegations Against Meta
Tolerance for Harmful Content
Court documents reveal Instagram allowed accounts involved in sex trafficking to remain active until they accumulated 16 violations, while accounts flagged for minor infractions such as spam were removed immediately. Former safety executive Vaishnavi Jayakumar described this as a “very high strike threshold” that put children at risk.
Meta maintains it has zero tolerance for child sexual abuse material and uses both AI systems and human reviewers to detect harmful content.
Misleading Public and Lawmakers
Internal research reportedly found that teen anxiety and depression decreased when social media use was reduced, but these findings were not shared publicly. The company allegedly gave misleading responses to US Senate inquiries about its platforms’ effects on teen mental health.
Exposure to Adult Strangers
Filings detail how millions of teens faced inappropriate interactions with adults. Recommendations to make teen accounts private by default were delayed for years over engagement concerns, and Instagram Reels reportedly increased teens’ exposure to strangers.
Targeting Young Users
Meta allegedly targeted preteens to boost engagement, with internal communications comparing the strategy to cigarette companies marketing to children. The company denies allowing under-13 registrations.
Blocking Safety Measures
Executives reportedly blocked initiatives such as hiding “likes,” limiting beauty filters, and reducing addictive features, despite internal findings that these changes would improve teen wellbeing.
Addictive Design
Internal research described Instagram as a “drug” and compared the company to “pushers,” yet Meta publicly downplayed addiction risks and delayed features like “quiet mode” that might reduce engagement.
Why This Case Matters for Indian Families
This lawsuit provides rare insight into how social media companies weigh safety against growth. For families in India, where teen social media use is rising rapidly, the case highlights potential risks and the importance of parental controls and privacy settings.
The outcome could shape global social media safety standards as courts examine whether Meta knowingly endangered children for commercial gain.
Meta’s Official Response
A Meta spokesperson stated: “We take the safety of teens seriously. We’ve introduced Instagram Teen Accounts, AI systems to detect harmful content, and parental tools. Our platforms continue to evolve to provide experiences that are not only safe but also age-appropriate. Allegations that we deliberately harm teens are false.”
The case continues to unfold in California courts as part of broader multidistrict litigation that also involves complaints against YouTube, Snapchat, and TikTok.