Meta Accused of Shutting Down Research Showing Facebook, Instagram Mental Health Harm
Key Takeaways
- Court filings allege Meta suppressed internal research showing Facebook and Instagram cause mental health issues
- The “Project Mercury” study found users felt less depressed and anxious after deactivating Facebook
- Meta denies allegations, claims research methodology was flawed
Newly unredacted court documents contain explosive allegations that Meta shut down internal research after discovering causal evidence linking Facebook and Instagram to mental health problems in users. The allegations emerge from filings in a class-action lawsuit brought by US school districts against major tech companies.
Project Mercury: The Suppressed Findings
The legal claim centers on Meta’s 2020 internal research initiative, code-named “Project Mercury.” Conducted in partnership with survey firm Nielsen, the study examined what happened when users deactivated Facebook and Instagram.
Internal documents cited in the filing show concerning results: “people who stopped using Facebook for a week reported lower feelings of depression, anxiety, loneliness and social comparison.”
Rather than publish these findings or continue the research, Meta terminated the project, the lawsuit alleges, and internally dismissed the negative outcomes, blaming them on the existing “media narrative” about the company.
Internal Concerns and Tobacco Industry Comparison
Despite that dismissal, internal communications show staff privately assured then-head of global public policy Nick Clegg that the study’s conclusions about “causal impact on social comparison” were valid.
One staff member reportedly expressed serious concerns, comparing the situation to the tobacco industry “doing research and knowing cigs were bad and then keeping that info to themselves.”
Meta’s Response: Methodology Flaws
Meta has strongly denied the allegations. In a statement released on Saturday (November 22), company spokesman Andy Stone said the study was halted because of methodological flaws.
Stone emphasized Meta’s commitment to user safety: “The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens.”