Meta to Face US Trial Over Illegal Content on Facebook, Instagram
Meta, the parent company of Facebook and Instagram, will stand trial in the United States following a BBC undercover investigation that found its algorithms were promoting illegal content.
Key Findings of the Investigation
A BBC Panorama probe revealed that platform algorithms were actively recommending posts and accounts involving:
- Illegal Drugs: Facebook’s algorithm promoted sales of cocaine and MDMA.
- Weapons: Posts selling guns and knives were also recommended.
- Sexual Services: Instagram’s algorithm suggested accounts offering sexual services, including those involving minors.
Meta’s Response and Acknowledged Challenges
In its response, Meta stated that it has “strict policies” against such content and employs “advanced technology” to remove it. The company acknowledged its systems are “not perfect” and said it is “constantly working to improve them.”
The investigation further highlighted systemic issues within Meta’s content moderation teams. Moderators reported feeling overwhelmed and under-resourced, leading to significant delays in taking down harmful material.
The Upcoming Legal Battle
The trial, expected to begin in the coming months, will determine whether Meta violated US law by allowing the promotion of illegal content. The outcome could have major implications for the tech giant, which is already under intense scrutiny for its content moderation practices.
This case is likely to reignite the global debate on the responsibility of social media companies to effectively police their platforms and ensure user safety.
Commitment Under Scrutiny
While Meta asserts it is “committed to keeping our platforms safe” and will “continue to invest in people and technology,” the trial will be the ultimate test of whether these efforts meet legal standards and adequately protect users from harm.