Meta Allegedly Stalled Child Safety Efforts, Hid Evidence; Musk Calls Revelations ‘Terrible’
Elon Musk has called “terrible” new allegations that Meta Platforms intentionally stalled internal efforts to keep child predators from contacting minors and concealed evidence of social media harm.
Key Takeaways
- Court filings allege Meta designed ineffective youth safety features to avoid hindering growth
- Internal research showing that deactivating Facebook reduced depression and anxiety was buried
- Meta required 17+ sex trafficking violations before account removal
- Mark Zuckerberg reportedly prioritized metaverse over child safety funding
Responding to a Times post on X about millions of adult strangers contacting children on Meta platforms, Musk replied with a single word: “Terrible.”
Court documents allege that Meta’s products worsened teen mental health, with content related to eating disorders, suicide, and child sexual abuse frequently detected but rarely addressed.
What Court Filings Reveal
According to Reuters, the filings contain several serious allegations against Meta:
- Youth safety features were intentionally designed to be ineffective to avoid impacting platform growth
- Accounts required 17+ sex trafficking violations before removal – described as “a very, very, very high strike threshold”
- Meta continued algorithm changes that increased teen exposure to harmful content despite awareness of risks
- Internal child safety efforts were stalled “for years” while staff were pressured to support this approach
The filings also quote a 2021 text in which Mark Zuckerberg said child safety wasn’t his top priority compared with “a number of other areas I’m more focused on building like the metaverse.” He reportedly dismissed funding requests from then-global public policy head Nick Clegg.
Buried Evidence of Harm
A 2020 internal study, code-named “Project Mercury” and conducted with Nielsen, found that “people who stopped using Facebook for a week reported lower feelings of depression, anxiety, loneliness and social comparison”.
Rather than publish these findings, Meta cancelled the project, internally declaring it “tainted by the existing media narrative” despite employees’ assurances that the research was valid.
One staff researcher wrote: “The Nielsen study does show causal impact on social comparison 😞”. Another likened the silence to tobacco companies “doing research and knowing cigs were bad and then keeping that info to themselves.”
Despite the study showing causal links to negative mental health, Meta later told Congress it couldn’t quantify platform harm to teenagers.
Meta’s Response
Meta spokesman Andy Stone said the study was stopped because its methodology was flawed and that the company has worked diligently to improve the safety of its products. “The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens,” he said.
The allegations emerge from a class action brought by the law firm Motley Rice on behalf of school districts suing Meta, Google, TikTok and Snapchat.
Stone disputed the claims: “We strongly disagree with these allegations, which rely on cherry-picked quotes and misinformed opinions.” He emphasized that Meta’s current policy is to remove accounts immediately when they are flagged for sex trafficking.
The underlying documents remain non-public, and Meta has filed a motion to strike them. A hearing is scheduled for January 26 in the U.S. District Court for the Northern District of California.