Govt Directs Social Media Platforms to Remove Pornographic and Obscene Content
The IT Ministry warns platforms to proactively remove obscene content or face action under IT Rules, 2021.
The Ministry of Electronics and Information Technology (MeitY) has issued an advisory directing social media platforms and digital intermediaries to proactively take down content that is obscene, pornographic, or otherwise unlawful. Issued on December 29, 2025, the advisory reiterates obligations under the Information Technology Rules, 2021, and notes that a new rulebook is expected to come into effect from the first week of January, along with the formation of a dedicated committee to oversee implementation and enforcement of the updated provisions. The government has warned that failure to comply with these rules will invite legal action against platforms and intermediaries.
Stricter Obligations for Large Platforms
The advisory places special responsibility on large social media platforms, defined as those with more than 50 lakh users. Such platforms are required to deploy technology-based solutions to automatically detect and remove objectionable content from their services.
According to the Ministry, large platforms have been failing to adequately screen an increasing volume of obscene content, prompting the need for renewed enforcement.
Reference to IT Rules, 2021
The advisory reiterates that intermediaries must make reasonable efforts to ensure users do not host, upload, publish, transmit, store, or share content that is obscene, pornographic, paedophilic, harmful to children, or otherwise illegal.
The rules apply across all forms of digital content and platforms, without exception.
No Safe Harbour Protection for Non-Compliance
The Ministry has clarified that failure to follow due diligence requirements under the IT Rules will result in loss of protection under Section 79 of the IT Act. This protection, commonly referred to as safe harbour, shields platforms from legal liability for user-generated content.
Without safe harbour, platforms can be held legally responsible for content posted by users and may face court proceedings.
Mandatory Takedown Timelines
Under Rule 3(2)(b) of the IT Rules, intermediaries are required to remove or disable access to content depicting an individual in a sexual act, including impersonation, within 24 hours of receiving a complaint from the affected person or someone acting on their behalf.
The advisory reinforces that this timeline is mandatory and non-negotiable.
Legal Consequences for Violations
The Ministry has stated that non-compliance with the IT Act and IT Rules, 2021, may lead to prosecution under the IT Act, the Bharatiya Nyaya Sanhita, and other applicable criminal laws. This applies not only to platforms and intermediaries but also to users involved in hosting or sharing prohibited content.
Background and Supreme Court Context
The advisory follows recent observations by the Supreme Court, which urged the Union government to take action against rising obscenity on the internet. In the past month, the government has blocked nearly 25 India-based OTT platforms found to be specialising in erotic content.
The government has also proposed introducing broader legal language to explicitly prohibit online obscenity in India.
Enforcement Going Forward
While the Ministry has not cited a specific incident that triggered the advisory, it has emphasised that enforcement will continue and platforms are expected to strengthen content moderation systems immediately.
The advisory underscores that digital intermediaries must actively prevent unlawful content rather than rely only on user complaints.
