NEW DELHI: Meta Platforms has flagged concerns over India’s new rule that requires platforms to remove certain harmful content within three hours of receiving a valid order, saying the deadline may be difficult to meet in practice.
“Operationally three hours (take down window) is going to be really challenging,” Rob Sherman, vice president of policy and deputy chief privacy officer at Meta, said at a media roundtable on Tuesday in New Delhi. “Traditionally, the Indian government’s been quite consultative when it comes to these things. This is an example where I think we’re concerned that had they come to us and talked to us about it, we would have talked about some of the operational challenges.”
The Centre on 10 February brought in a stricter compliance regime for social media companies such as X, Facebook, Instagram and Telegram by formally notifying amendments to the existing Information Technology Rules aimed at combating the misuse of artificial intelligence (AI) through deepfakes and other sensitive “synthetic” content. Companies falling under the intermediary definition will have to comply with the law starting 20 February.
Under the new rules, enforcement timelines for removing objectionable material have been sharply tightened. Non-consensual sexual imagery, including deepfakes, must be removed within two hours instead of 24 hours previously. Any other unlawful content must be taken down within three hours of a user report or a government or court order, compared with the earlier 36-hour window.
Sherman said the company uses a wide range of tools and techniques to spot content that violates its terms of service or community standards, but the main challenge under the new rules would be the logistics of investigating and validating requests accurately within such a short timeframe.
“Whenever we get the request from the government (to take down content), we will have to look into it, we will have to investigate it and validate it ourselves. And so that’s just something that takes some amount of time particularly if there’s something that we need to look into. That’s often not possible to turn around in three hours,” Sherman said.
The tighter timelines come as the misuse of AI through deepfakes and non-consensual sexual imagery has increasingly affected users. The government, however, has maintained that compliance should not be a challenge for platforms given their technological capabilities.
“There are certain classes of illegal content such as CSEAM (child sexual exploitative and abuse material) that may require an urgent takedown. But it will not be fair if we are treating all classes of content with the same sense of severity, especially when such orders may come from a wide range of authorities with varying capacities,” said Dhruv Garg, partner at the Indian Governance and Policy Project (IGAP).
According to Garg, such immediate takedown actions could create a unidirectional process with no feedback mechanism, giving platforms little option but to comply with the order.
On Tuesday, communications and IT minister Ashwini Vaishnaw said the government is in talks with social media platforms on tackling deepfakes and age-based restrictions to protect society from the harms of AI.
“…At Meta, we have done a lot of work to build things like teen accounts so that there are parental controls, so that parents can make the choices that are right for them or for how their kids are using social media,” Sherman said, adding that Australia-style social media bans for teens are probably not serving the goal they are meant to serve.
He added that a more prudent approach could be age-based classification of teens, similar to the approach followed in the UK.
Privacy law adds to compliance burden
On the timelines to comply with the Digital Personal Data Protection (DPDP) Act, Sherman noted that while most countries provide a transition period of about two years to implement new privacy rules, the Indian government has significantly shortened that timeline.
The rules, notified in November last year, require companies to comply with the Act’s provisions within 12–18 months, including appointing consent managers and data-protection officers, putting in place systems for express user permission, and reporting data breaches within 72 hours.
“We are still in the process of looking at what that will mean in terms of how we will comply. We have every confidence that we will do our best but we are still figuring out exactly what that looks like,” Sherman said.
Under the DPDP Rules, 2025, the government has the authority to direct that specific categories of personal data be processed and stored only within India.
Sherman said Indian government discussions on localization typically focus on “specific kinds of information that have national security implications.” He added that strict localization requirements would be logistically difficult for platforms such as WhatsApp, Instagram and Facebook because they are designed for cross-border communication, which inherently requires data to be stored in multiple global locations to function.