Introduction
A single “intimation” may now decide whether a tweet, reel, or post survives online. The recent amendments to Rule 3(1)(d) of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“IT Rules, 2021”), mark a quiet but consequential shift in India’s content moderation framework. Building on the long-debated “actual knowledge” standard articulated in Shreya Singhal v. Union of India, the government has introduced new safeguards to ensure that takedown orders emanate only from “authorised officers” and are accompanied by reasoned intimations to intermediaries.
On paper, the change aims to prevent arbitrary takedowns, strengthen procedural accountability, and address the growing menace of deepfakes and unlawful content. However, the amendment also expands the architecture of state intervention into online speech, raising old questions about proportionality, transparency, and institutional capacity. As these changes take effect from November 15, 2025, they present an opportunity to reassess India’s safe-harbour regime and its delicate balance between regulatory control and free expression.
Regulatory Landscape and the Safe-Harbour Threshold
India’s intermediary regime rests on a deceptively simple hinge: Section 79 of the Information Technology Act, 2000 (“IT Act, 2000”) grants conditional immunity to intermediaries who, having no “actual knowledge” of unlawful content, follow prescribed due diligence norms. The IT Rules, 2021, the principal regulatory instrument that operationalised those norms, required intermediaries to act on either court orders or notifications from the Appropriate Government before removing content under Rule 3(1)(d).
Two tensions have run through this architecture since the Shreya Singhal judgment:
- Courts have insisted that removal be traceable to a clear legal standard and be susceptible to review.
- Intermediaries and civil-society actors have repeatedly complained that government interventions were often amorphous, leaving platforms with little to work with and users with little recourse.
The result was a contested middle ground:
- Intermediaries seeking to preserve safe-harbour,
- State agencies seeking swift takedowns, and
- Courts urging reasoned, reviewable administrative action.
Recent litigation and reporting have made this fault line visible; judges have pressed for clearer procedures, and NGOs have called for greater transparency in takedown practices.
The 2025 Amendments to Rule 3(1)(d)
The 2025 amendments to Rule 3(1)(d) are, in part, a direct response to these frictions. They reframe the state’s “notification” power into a duty to issue reasoned intimations that identify:
| Required Information | Description |
|---|---|
| Statutory Provision Relied Upon | The legal basis for action against the content. |
| Nature of Alleged Unlawful Act | Explanation of how the content violates the law. |
| Precise URL or Electronic Location | Specific identification of the content to be removed. |
They also require that such intimations be issued only by officers of specified seniority and be subject to monthly Secretary-level review. The government frames these changes as aligning takedown practice with the Section 79(3)(b) requirement of “actual knowledge” and as measures to reduce arbitrariness while improving intermediaries’ ability to assess claims.
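For illustration only, the required elements above can be read as a machine-checkable schema. The sketch below shows how a platform's intake tooling might flag facially deficient intimations; the field names, rank strings, and data shapes are hypothetical, not drawn from the Rules themselves.

```python
# Illustrative sketch: checking that an incoming takedown intimation carries
# the elements the amended Rule 3(1)(d) requires. Field names and the
# exact-match rank check are simplifying assumptions for demonstration only.

REQUIRED_FIELDS = {"statutory_provision", "nature_of_unlawful_act", "content_url"}
MIN_RANKS = {"government": "Joint Secretary", "police": "Deputy Inspector General"}

def validate_intimation(intimation: dict) -> list[str]:
    """Return a list of deficiencies; an empty list means facially complete."""
    problems = []
    # Each required element must be present before the notice is actionable.
    for field in sorted(REQUIRED_FIELDS - intimation.keys()):
        problems.append(f"missing required element: {field}")
    # The issuing officer must meet the minimum rank for that authority type.
    issuer = intimation.get("issuing_authority", {})
    expected = MIN_RANKS.get(issuer.get("type"))
    if expected and issuer.get("rank") != expected:
        problems.append(f"issuing officer below required rank ({expected})")
    return problems

notice = {
    "statutory_provision": "Section 66D, IT Act, 2000",
    "content_url": "https://example.com/post/123",
    "issuing_authority": {"type": "police", "rank": "Deputy Inspector General"},
}
print(validate_intimation(notice))
# → ['missing required element: nature_of_unlawful_act']
```

The point of the sketch is simply that a reasoned intimation, unlike an amorphous demand, is specific enough to be checked mechanically before a human reviewer ever sees it.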
Implementation Questions and Doctrinal Alignment
However, doctrinal alignment does not by itself resolve implementation questions. “Actual knowledge” is not merely a formal checklist; it presupposes that intermediaries receive adequate factual and legal information to judge whether content is unlawful.
The amendments arguably advance that aim by demanding specificity; yet they also centralise decision-making and insert new procedural layers. That trade-off (better information for platforms versus potential delays and concentration of discretion) is the governing policy problem that this piece examines. Reporting on the reforms anticipates both the promise of greater accountability and concerns about bureaucratic bottlenecks and operational readiness.
What’s New: The Core Elements of the 2025 Amendments
The recent changes to the IT Rules, 2021, made via the amendment to Rule 3(1)(d), usher in a set of principal features that mark a shift in India’s takedown regime.
Senior-Level Authorisation
Under the amended regime, removal intimations must now be issued by officers of a certain seniority: for the Appropriate Government (central or state), only an officer of the rank of Joint Secretary or above may issue a takedown intimation, and for police requests, an officer of Deputy Inspector General of Police (DIG) or above must sign off.
This represents a clear shift away from lower-rank functionaries initiating takedowns without the procedural safeguards of higher-level oversight. The aim appears to be to embed greater accountability into the process and reduce arbitrary or unsupervised removals.
Reasoned Intimations with Specific Content Identifiers
A core innovation is the mandatory inclusion of a detailed “reasoned intimation” when notifying an intermediary.
| Required Element | Description |
|---|---|
| Statute/Provision Violated | Specify the legal basis for claimed violation. |
| Precise Content Identification | URL or unique content identifier. |
| Description of Unlawful Act | Clarifies why content is considered unlawful. |
This contrasts with earlier practice, where government or police notices often lacked specificity, leaving intermediaries to interpret vague or sweeping orders. The amendment seeks to align more closely with the safe-harbour threshold of “actual knowledge” under Section 79(3)(b) by equipping intermediaries with actionable information rather than a general demand for removal.
Monthly Secretary-Level Review of Intimations
The amended rules introduce a periodic administrative review requirement: every month, a Secretary-level official must review all takedown intimations issued in the preceding period. The review is designed to:
- Ensure compliance with procedure
- Verify officer rank of authorising authority
- Confirm clarity of content identification
- Check sufficiency of information provided to intermediaries
This built-in check is intended to act as a supervisory mechanism over the state’s removal machinery and provide an internal audit of compliance.
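Purely as an illustration, the four-point checklist above can be imagined as an internal audit run over a month's intimation records. The record fields and deficiency labels below are hypothetical, not prescribed by the Rules.

```python
# Illustrative sketch of the monthly Secretary-level review as an internal
# audit that tallies deficiency categories across a batch of intimation
# records. All field names are assumptions for demonstration only.
from collections import Counter

def monthly_review(intimations: list[dict]) -> Counter:
    """Tally deficiency categories across a month's intimations."""
    tally = Counter()
    for rec in intimations:
        if not rec.get("procedure_followed"):
            tally["procedural lapse"] += 1
        if not rec.get("officer_rank_verified"):
            tally["unverified officer rank"] += 1
        if not rec.get("content_identifier"):
            tally["unclear content identification"] += 1
        if not rec.get("reasons_given"):
            tally["insufficient information"] += 1
    return tally

batch = [
    {"procedure_followed": True, "officer_rank_verified": True,
     "content_identifier": "https://example.com/a", "reasons_given": True},
    {"procedure_followed": True, "officer_rank_verified": False,
     "content_identifier": "", "reasons_given": True},
]
print(dict(monthly_review(batch)))
# → {'unverified officer rank': 1, 'unclear content identification': 1}
```

An aggregated tally of this kind is exactly the sort of output that could feed the public transparency reporting discussed later in this piece.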
Framing the Amendments in the Context of Emerging Challenges
The government simultaneously links these changes to broader digital-ecosystem threats, such as deepfakes and manipulated audio and video content, emphasising the need for timely and precise takedowns of harmful content. Accompanying proposals reportedly include:
- Labelling of AI-generated content
- Stricter rules for removal of synthetic media
- Faster response to viral harmful content
The message is clear: India’s intermediary liability regime must now adapt to an age in which harmful content spreads virally within hours.
Operational Implications for Intermediaries and Platforms
Practically, these changes mean that platforms must track not only takedown notices but also metadata about authorising officers, content identifiers, and monthly compliance logs for review. The likely operational consequences include:
- Higher documentation and auditing burden
- Increased justification requirements for government and state agencies
- Potential slowdown in urgent takedowns without emergency pathways
Legal Analysis: How the Amendments Square with Safe-Harbour, Reviewability, and Free Speech
At their core, the 2025 amendments attempt to reconcile three competing imperatives:
- The intermediary’s safe-harbour
- The citizen’s right to free expression
- The state’s duty to maintain order
The Supreme Court’s decision in Shreya Singhal v. Union of India (2015) confined “actual knowledge” to court orders or government notifications conforming to law. The 2025 amendments retain this architecture but raise the procedural threshold for what qualifies as a valid “intimation.”
Relevant Case Law Illustrations
| Case | Court | Key Takeaway |
|---|---|---|
| Sadhguru Jagadish Vasudev & Anr v. Igor Isakov & Ors (2024) | Delhi HC | Platforms must use technology to remove misleading content while respecting immunity. |
| TV Today Network Ltd. v. News Laundry Media Pvt Ltd (2025) | Delhi HC | Need for reasoned takedown processes to prevent arbitrary removals. |
| X Corp. v. Union of India (2025) | Karnataka HC | Subjective morality cannot dictate removals; clear legal basis required. |
| Software Freedom Law Centre v. State of NCT of Delhi (2025) | Delhi HC | Administrative review without transparency is insufficient. |
While the requirement of reasoned orders is positive, case law highlights that platforms’ over-compliance remains a concern. Unless the amendments translate into transparency and user recourse, constitutional rights risk being subordinated to administrative convenience.
Capacity Challenges
According to the government’s own figures, India receives over 5,000 content removal requests daily. Each must now pass through a designated senior officer and be accompanied by a reasoned intimation. The implications include:
- Need for significant bureaucratic capacity
- Risk of severe delays
- Risk of informal takedowns continuing without documentation
Implementing a multi-tiered officer-led model presupposes infrastructure and training that many departments lack.
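To make the scale concrete, a rough back-of-envelope estimate can be worked out. Only the 5,000-requests-per-day figure comes from the text above; the per-request review time and working hours are illustrative assumptions, not official data.

```python
# Back-of-envelope capacity estimate. The review-time and working-hours
# figures are illustrative assumptions, not government data.
requests_per_day = 5000           # government figure cited above
minutes_per_review = 15           # assumption: time for a reasoned intimation
working_minutes_per_day = 8 * 60  # assumption: one eight-hour shift

officer_days_needed = requests_per_day * minutes_per_review / working_minutes_per_day
print(f"{officer_days_needed:.1f} officer-days of review required per day")
# → 156.3 officer-days of review required per day
```

Even under these generous assumptions, the arithmetic implies a reviewing workforce in the hundreds of senior officers, which underlines why capacity, not doctrine, may prove the binding constraint.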
Comparative Reflections: Lessons from the UK, EU & US
European Union: Digital Services Act (DSA)
- Notice and action framework
- Counter-notice rights for users
- Mandatory transparency reports
- Independent audits
India’s model lacks user participation and transparency obligations.
United Kingdom: Online Safety Act
- “Duty of care” model
- Regulation by Ofcom
- Internal risk assessments required
- User redress systems encouraged
India echoes some aspects but without an independent regulator.
United States: Section 230
- Broad immunity for platforms
- Minimal state interference
- Strong First Amendment orientation
India is moving in the opposite direction by centralising state control.
Taken together, these models highlight India’s distinctly executive-centric trajectory, privileging coordination and control over co-regulation and user rights.
Policy Recommendations: Bridging The Gap Between Law And Practice
The 2025 amendments are a step toward procedural rationalisation, but without institutional reinforcements, they risk remaining normatively sound yet operationally fragile. To translate the new framework into credible governance, three areas of reform stand out:
Institutionalise Transparency And Oversight Mechanisms
Currently, there is no statutory obligation to disclose takedown data, grounds for removal, or outcomes of the Secretary-level review. The government should adopt a transparency-by-default approach akin to the EU’s DSA:
- Mandate quarterly transparency reports from both the government and major intermediaries, detailing the number, source, and categories of removal requests.
- Publish aggregated review findings to allow for public and parliamentary scrutiny.
- Require intermediaries to maintain a searchable public archive of takedown orders (with redactions for privacy or national security).
This would not only improve accountability but also build the empirical foundation that India’s enforcement framework presently lacks, transforming anecdotal oversight into a data-driven feedback loop.
Build Regulatory And Technological Capacity
The success of the “authorised officer” model depends on administrative bandwidth. Currently, most state-level IT cells lack specialised training in constitutional speech standards or digital forensics. MeitY should consider:
- Creating a Digital Content Regulation Division within the ministry, staffed with technologists, legal experts, and policy analysts.
- Developing AI-assisted dashboards to triage and monitor takedown requests, improving speed without sacrificing due process.
- Partnering with civil-society organisations and academic institutions to build capacity around procedural proportionality and rights-based moderation.
Without such investments, even the best-designed safeguards will be undermined by bottlenecks and inconsistent implementation.
Embed User Rights And Procedural Fairness
A long-term equilibrium requires more than safe-harbour protection for intermediaries; it demands speech-harbour protection for citizens. The Rules should therefore:
- Introduce a statutory counter-notice mechanism that enables users to appeal takedowns.
- Mandate reasoned communication to affected users when their content is removed.
- Encourage the creation of a Grievance Review Board, either under MeitY or an independent statutory body, to adjudicate content disputes in a time-bound manner.
Together, these reforms would align India’s takedown framework with emerging global standards while preserving constitutional safeguards. The aim should not be to eliminate unlawful content faster, but to ensure that every removal is lawful, necessary, and proportionate. Only then can India’s digital governance model command both legitimacy and trust.
Conclusion
The 2025 amendments to Rule 3(1)(d) mark a significant step in India’s intermediary regulation, formalising authorised officers, reasoned orders, and periodic review. These procedural safeguards enhance predictability and provide a framework for lawful takedowns. Yet, procedural improvements alone cannot guarantee constitutional safeguards. Without transparency, public reporting, or user-oriented appeal mechanisms, removals risk remaining opaque, and intermediaries may continue over-compliance out of caution.
Globally, intermediary regulation increasingly emphasises accountability through disclosure rather than centralised control. India’s model, while improving procedural rigour, remains executive-centric. Strengthening institutional capacity, embedding transparency, and enabling user recourse are essential to ensure that takedowns are lawful, proportionate, and rights-respecting. Ultimately, the legitimacy of content removal rests not only on the rules themselves but on how consistently and transparently they are enforced: the true measure of a free, secure, and democratic internet.
References:
- https://www.pib.gov.in/PressReleasePage.aspx?PRID=2181719
- https://www.meity.gov.in/static/uploads/2024/02/Information-Technology-Intermediary-Guidelines-and-Digital-Media-Ethics-Code-Rules-2021-updated-06.04.2023-.pdf
- https://indiankanoon.org/doc/110813550/
- https://www.indiacode.nic.in/show-data?actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=105
- https://forum.nls.ac.in/ijlt-blog-post/rule-31b-intermediary-liability-and-the-burden-of-reasonable-efforts/
- https://cyber.harvard.edu/story/2022-07/toward-best-practices-around-online-content-removal-requests
- https://www.law.ox.ac.uk/sites/default/files/migrated/opbp_report-_regulation_of_digital_media_and_intermediaries.pdf
- https://www.pib.gov.in/PressReleasePage.aspx?PRID=2175963
- https://www.meity.gov.in/static/uploads/2025/10/8e40cdd134cd92dd783a37556428c370.pdf
- https://www.bricscompetition.org/news/india-tightens-rules-on-online-content-removal-and-cracks-down-on-deepfakes-across-social-media
- https://indiankanoon.org/doc/1218090/
- https://images.assettype.com/barandbench/2025-06-02/urxxbv0x/_Sadhguru_Jagadish_Vasudev___Anr_V__Igor_Isakov___Ors_.pdf
- https://globalfreedomofexpression.columbia.edu/cases/tv-today-network-limited-v-news-laundry-media-private-limited/
- https://indiankanoon.org/doc/7614885/
- https://forum.nls.ac.in/nlsir-online-blog/expanding-the-chilling-effect-doctrine-through-kunal-kamra/
- https://thewire.in/culture/governments-requests-reddit-content-removal
- https://theprint.in/india/india-3rd-among-countries-sending-google-most-content-removal-requests-20000-over-last-10-years/1877241/
- https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package
- https://www.gov.uk/government/publications/online-safety-act-explainer/online-safety-act-explainer
- https://www.columbia.edu/~mr2651/ecommerce3/2nd/statutes/CommunicationsDecencyAct.pdf
- https://harvardlawreview.org/print/vol-131/section-230-as-first-amendment-rule/
- https://repository.nls.ac.in/cgi/viewcontent.cgi?article=1076&context=ijclp
- https://theleaflet.in/due-process/iff-and-digital-rights-activists-criticise-centres-new-digital-regulation-guidelines-as-unconstitutional
- https://thewire.in/rights/online-gaming-regulation-it-rules
- https://pmc.ncbi.nlm.nih.gov/articles/PMC6390894/
- https://cyber.harvard.edu/publication/2019/content-and-conduct
- https://academic.oup.com/book/27505
- https://www.pib.gov.in/PressReleasePage.aspx?PRID=1894258
- https://forum.nls.ac.in/ijlt-blog-post/online-fake-news-paving-the-way-forward/