The Central Government has notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026, significantly tightening the regulatory framework governing synthetically generated information, including AI-generated audio, visual and audio-visual content. The amended rules, issued by the Ministry of Electronics and Information Technology (MeitY) under the Information Technology Act, 2000, will come into force on 20 February 2026.
A key feature of the amendment is the statutory recognition and definition of “synthetically generated information”, covering any content that is artificially or algorithmically created or altered in a manner that appears real or authentic, including deepfakes. At the same time, the rules carve out exceptions for routine, good-faith editing, accessibility enhancements, educational or research materials, and technical corrections that do not distort the substance or meaning of the original content.
The amendments substantially expand the due diligence obligations of intermediaries, especially platforms that enable the creation or dissemination of AI-generated content. Such intermediaries are now required to:
• Deploy reasonable and appropriate technical measures, including automated tools, to prevent the creation or spread of unlawful synthetic content, including deepfakes, child sexual abuse material, non-consensual intimate imagery, false electronic records, and deceptive portrayals of individuals or real-world events.
• Ensure mandatory and prominent labelling of all lawful synthetically generated content, either through visible on-screen notices or prefixed audio disclosures, along with permanent metadata or technical provenance markers, including unique identifiers, wherever technically feasible (a sketch of what such labelling might look like in practice follows this list).
• Prohibit the removal or suppression of such labels or metadata, ensuring the traceability and accountability of synthetic content.
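
To make the labelling and provenance requirements concrete, here is a minimal Python sketch (using the Pillow imaging library) of how a platform might stamp a visible notice on an AI-generated image and embed a provenance record in its metadata. The label text, metadata key, and provenance fields are illustrative assumptions, not anything prescribed by the rules, and PNG text chunks are only a stand-in for the tamper-resistant markers the rules contemplate.

```python
from PIL import Image, ImageDraw
from PIL.PngImagePlugin import PngInfo
import datetime
import hashlib
import json
import uuid

LABEL = "AI-GENERATED CONTENT"  # hypothetical label text; the rules do not prescribe exact wording

def label_and_tag(src_path: str, dst_path: str, generator: str) -> str:
    """Stamp a visible notice on an image and embed a provenance record in its metadata."""
    img = Image.open(src_path).convert("RGB")

    # Visible on-screen notice in the top-left corner.
    draw = ImageDraw.Draw(img)
    draw.rectangle([0, 0, 280, 28], fill="black")
    draw.text((8, 7), LABEL, fill="white")

    # Provenance record: unique identifier, generator name, timestamp, content hash.
    provenance = {
        "id": str(uuid.uuid4()),
        "generator": generator,  # hypothetical field naming the generating model or tool
        "created": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "sha256": hashlib.sha256(img.tobytes()).hexdigest(),
    }

    # PNG text chunks survive ordinary copying but are NOT tamper-proof;
    # a real deployment would use a signed provenance standard such as C2PA.
    meta = PngInfo()
    meta.add_text("synthetic-provenance", json.dumps(provenance))
    img.save(dst_path, "PNG", pnginfo=meta)
    return provenance["id"]
```

In practice, intermediaries would more likely adopt a signed provenance standard such as C2PA, since plain metadata can be stripped with trivial effort, which is precisely the conduct the prohibition on label removal targets.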
For significant social media intermediaries, the rules impose additional obligations. Platforms must require users to declare whether content is synthetically generated, deploy verification mechanisms to assess the accuracy of such declarations, and ensure that confirmed synthetic content is published only with a clear and prominent label (a simplified sketch of this declare-verify-label flow appears below). Failure to act against non-compliant synthetic content may result in the intermediary being deemed to have failed to exercise due diligence, jeopardising safe-harbour protection under Section 79 of the IT Act.
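
As an illustration only, the sketch below models that declare-verify-label flow in Python. The threshold, detector interface, and label text are assumptions; the rules leave the design of the actual verification mechanism, and how a platform weighs a user's declaration against an automated detector, to the intermediary.

```python
from dataclasses import dataclass
from typing import Callable, Optional

THRESHOLD = 0.9  # hypothetical confidence cut-off for the automated detector

@dataclass
class Upload:
    content_id: str
    declared_synthetic: bool  # the user's mandatory declaration at upload time

def moderate(upload: Upload, detector: Callable[[str], float]) -> dict:
    """Publish an upload under a declare-verify-label flow."""
    score = detector(upload.content_id)  # detector returns an estimated P(synthetic)
    is_synthetic = upload.declared_synthetic or score >= THRESHOLD
    label: Optional[str] = "Synthetically generated" if is_synthetic else None
    # Confirmed synthetic content is published only with a prominent label;
    # everything else passes through unlabelled.
    return {"content_id": upload.content_id, "publish": True, "label": label}

# Example: the user denies synthetic origin, but the detector is confident otherwise.
print(moderate(Upload("vid-123", declared_synthetic=False), lambda _cid: 0.97))
```

The point of the example is the ordering of the obligations: the declaration is collected first, the verification mechanism is applied second, and the label is attached before publication rather than after the fact.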
The amendments also shorten response timelines for intermediaries in takedown and grievance redressal processes, in certain cases reducing compliance windows from hours or days to as little as two to three hours, reflecting the government's emphasis on rapid action against harmful online content.

