Introduction
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“Intermediary Rules”) under the Information Technology Act, 2000 (“IT Act”) have been effective in India since February 2021. The Intermediary Rules seek to regulate online content and impose obligations on intermediaries (online platforms that host, transmit or store third-party information, such as social media companies, messaging services, and search engines). The Intermediary Rules have been amended from time to time.
On 22 October 2025, following stakeholder feedback and with a view to bringing additional clarity and further regulating intermediaries, the Ministry of Electronics and Information Technology (“MEITY”) (i) notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2025 (“Amended Intermediary Rules”); and (ii) introduced a draft notification to further amend the Intermediary Rules to regulate synthetically generated or AI-generated information (“Draft Notification”).
Key Provisions – Amended Intermediary Rules
The Amended Intermediary Rules, which came into effect from 15 November 2025, amend and substitute Rule 3(1)(d) of the Intermediary Rules. The extant Rule 3(1)(d) sets out the due diligence obligation of intermediaries to take down (remove or disable) any unlawful information or content within 36 hours of receiving ‘actual knowledge’ of such content being hosted on their platform, by way of (i) an order from a court of competent jurisdiction, or (ii) a notification from an ‘Appropriate Government’ or its agency. ‘Appropriate Government’ includes both the Central and State Governments, as well as agencies appointed by such governments. Meeting this obligation is essential for an intermediary to avail ‘safe harbour’ protection (i.e., protection from liability arising from third-party content hosted on its platform).
Given the broad wording of the Rule, several intermediaries were receiving large volumes of overly broad and vague notices from authorities and agencies claiming the power to issue such notices under the Intermediary Rules. This raised concerns about the legal authority of such agencies to issue notices, and overburdened intermediaries seeking to preserve their safe harbour protection.
Pursuant to the Amended Intermediary Rules, additional safeguards have been introduced to ensure that content removal notices are more transparent, clear and reasoned. Intermediaries remain obligated to take down unlawful information or content within 36 hours of receiving actual knowledge. However, the scope of the ‘Appropriate Government’ that can issue a removal notice has been clarified. Only senior officers of the Central or State Government not below the rank of Joint Secretary (or equivalent), or, where no officer of such rank is appointed, a Director or an officer of equivalent rank, may issue takedown directions. In the case of police authorities, the notice can only be issued by an officer not below the rank of a Deputy Inspector General of Police. Where an agency has been specifically authorised by the Central or State Government to exercise powers under the Intermediary Rules, it must act through a single designated officer, creating a single point of contact for issuing takedown directions.
Further, the Amended Intermediary Rules now require that all directions be in writing, in the form of reasoned intimations setting out the legal and statutory basis for the takedown notice and the specific URLs, digital identifiers or electronic locations of the allegedly unlawful content to be removed or disabled. All takedown notifications will be subject to a monthly review by a Secretary-level officer, who will evaluate the necessity and proportionality of such directions.
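For intermediaries designing intake workflows for such directions, the requirements above can be modelled as a simple record check. The following Python sketch is illustrative only; the field names and validation logic are our own assumptions for this example and are not prescribed by the Amended Intermediary Rules:

```python
from dataclasses import dataclass
from typing import List


@dataclass
class TakedownIntimation:
    """Hypothetical record of a takedown direction under the Amended Rules."""
    issuing_officer_rank: str       # e.g. "Joint Secretary" or "Deputy Inspector General"
    legal_basis: str                # statutory provision relied upon
    reasons: str                    # the reasoned justification for removal
    urls_or_identifiers: List[str]  # specific URLs / digital identifiers / locations

    def is_actionable(self) -> bool:
        """A direction should be written, reasoned, legally grounded and specific."""
        return bool(
            self.legal_basis.strip()
            and self.reasons.strip()
            and self.urls_or_identifiers
        )


# A notice citing a legal basis, reasons and specific URLs passes the check;
# one with none of these would not.
notice = TakedownIntimation(
    issuing_officer_rank="Joint Secretary",
    legal_basis="Rule 3(1)(d), Intermediary Rules read with the IT Act",
    reasons="Content impersonates a public official",
    urls_or_identifiers=["https://example.com/post/1"],
)
print(notice.is_actionable())
```

A check of this kind could feed the documentation trail that the monthly Secretary-level review would examine.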
Key Provisions – Draft Notification
Deepfakes and AI/synthetically generated content impersonating individuals, including celebrities, politicians and business leaders, have been circulating on social media platforms, spreading misinformation, influencing elections and enabling online fraud. Against the backdrop of the public outrage this has caused, MEITY has issued the Draft Notification to seek public comments.
The objective of the Draft Notification is to strengthen user safety, traceability, and accountability obligations for intermediaries, including social media intermediaries and significant social media intermediaries (SSMIs) (defined as platforms with more than five million registered users in India), as well as platforms that enable the creation or modification of synthetic content.
a. Defining Synthetically Generated Information
The Draft Notification defines synthetically generated information (“SGI”) to include any information that is “artificially or algorithmically created, generated, modified or altered using a computer resource, in a manner that appears authentic or true.” The Draft Notification further clarifies that all references to ‘information’ in the Intermediary Rules will include SGI, and that all provisions of the Intermediary Rules applicable to information (such as due diligence, removal and grievance redressal) will also apply in the context of SGI.
b. Due Diligence Obligations
The Draft Notification introduces a new requirement for intermediaries that ‘offer a computer resource’ which could enable, permit or facilitate the creation, generation, modification or alteration of information as SGI: they must ensure that such content is labelled or embedded with a permanent unique label, metadata or identifier. This identifier should make it possible to establish that the information is SGI created, generated, modified or altered using the computer resource of that specific intermediary. The label must be visibly displayed (covering at least 10% of the visual surface of the SGI) or audibly marked (over the first 10% of the audio’s duration). The intermediary must not allow removal, suppression or alteration of such labels.
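The 10% thresholds above are quantitative, so a platform could compute them mechanically. The Python sketch below illustrates one way to do so; the function names and the use of pixel area and seconds as the relevant measures are our own assumptions, not terms drawn from the Draft Notification:

```python
def min_label_area_px(width_px: int, height_px: int) -> int:
    """Minimum visible-label area: at least 10% of the visual surface.

    Uses integer arithmetic (ceiling division) to avoid floating-point
    rounding when the area is not divisible by 10.
    """
    total = width_px * height_px
    return -(-total // 10)  # equivalent to ceil(total / 10)


def audio_marker_seconds(duration_s: float) -> float:
    """Length of the audible marker: the first 10% of the audio's duration."""
    return 0.10 * duration_s


# For a 1000 x 1000 image, the label must cover at least 100,000 pixels;
# for a 120-second clip, the marker must span the first 12 seconds.
print(min_label_area_px(1000, 1000))
print(audio_marker_seconds(120.0))
```

In practice a platform would pair such a check with the embedded metadata or identifier so that both the visible and machine-readable marks survive re-encoding.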
c. Enhanced Obligations for SSMIs
The Draft Notification requires that SSMIs (i) obtain user declarations indicating whether displayed, uploaded, or published information is synthetically generated, (ii) deploy reasonable and proportionate technical measures to verify such declarations, and (iii) ensure that SGI is clearly labelled or accompanied by a notice. Failure to carry out the above may be treated as a breach of due diligence obligations under the Intermediary Rules.
d. Safe-Harbour Clarification
The Draft Notification clarifies that intermediaries which remove or disable access to harmful synthetic content in good faith retain safe-harbour protection under Section 79(2) of the IT Act. However, intermediaries may face risk if SGI is published or circulated without labelling or verification, even where there is no malicious intent. Platforms will need clear procedural workflows and documentation to demonstrate their good-faith efforts and to benefit from safe-harbour protection.
Conclusion
With fake content creation rampant, the intent of the Draft Notification is clear: to ensure that intermediaries take greater accountability for regulating the content published on their platforms, and that fake content is prohibited or removed swiftly. That said, the Draft Notification further increases the compliance burden on intermediaries, who will need to invest in additional technical and compliance resources. The public consultation ended in November 2025 and the final form of the notification is awaited.