
C&M E-Alert: Ministry of Electronics and Information Technology notifies the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026

  • Writer: Karan Singh Chandhiok
  • Feb 17
  • 11 min read

On 10th February 2026, the Ministry of Electronics and Information Technology ("MeitY") notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026 ("Notified Rules") to regulate synthetically generated information ("SGI"), including deepfakes and other AI-generated content, and to impose enhanced due diligence obligations on intermediaries and Significant Social Media Intermediaries ("SSMI"). The notification was preceded by the circulation of draft amendments ("Draft Rules") to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 ("IT Rules"), which were released for public consultation on 22nd October 2025.

The Notified Rules primarily aim to bring synthetically generated content within India's intermediary regulatory framework. They require digital platforms to ensure such content is clearly labelled and embedded with metadata, and to take prompt action when unlawful synthetic content is identified. The Notified Rules also introduce stricter due diligence obligations for intermediaries and shorten takedown timelines to enhance accountability and transparency in online content moderation. The Notified Rules will come into effect on 20th February 2026.

Along with the Notified Rules, on 10th February 2026, MeitY also published Frequently Asked Questions ("FAQs") to bring clarity and explain certain nuances pertaining to the due diligence requirements introduced in relation to SGI, and other associated concerns.


KEY DEFINITIONS

Audio, visual or audio-visual information

Definition: Any audio, image, photograph, graphic, video, moving visual recording, sound recording or any other audio, visual or audio-visual content, with or without accompanying audio, whether created, generated, modified or altered through any computer resource.

Illustrative example: A user uploads a video, a profile photograph, or a voice note.

Note: The FAQs clarify that pure text or written outputs, by themselves, do not constitute SGI, as SGI is limited to audio, visual or audio-visual information.

Synthetically generated information

Definition: Audio, visual or audio-visual information which is artificially or algorithmically created, generated, modified or altered using a computer resource, in a manner that such information reasonably appears to be real, authentic or true, and depicts or portrays any individual or event in a manner that is, or is likely to be, perceived as indistinguishable from a natural person or real-world event.

Illustrative example: An AI-generated video or a voice clip of a public figure appearing to endorse a product.


Certain Exceptions - The Notified Rules exempt audio, visual or audio-visual information from being classified as SGI if it falls within the following categories:

Routine or good-faith editing

Definition: Editing, formatting, enhancement, technical correction, colour adjustment, noise reduction, transcription, or compression that does not materially alter, distort, or misrepresent the substance, context, or meaning of the underlying audio, visual or audio-visual information.

Illustrative example: Adjusting brightness and colour balance on a poorly lit video recording, removing background noise from the audio track, and compressing the file size for distribution.

Routine or good-faith creation of documents and materials

Definition: Creation, preparation, formatting, presentation or design of documents, presentations, PDF files, educational or training materials, or research outputs, including the use of illustrative, hypothetical, draft, template-based or conceptual content, where such creation or presentation does not result in the creation or generation of any false document or false electronic record.

Illustrative example: Creating a PowerPoint presentation with template slides, sample charts, and hypothetical financial data to train employees on quarterly reporting procedures.

Note: If AI tools are used to generate false certificates, false official letters, forged identification documents, or fabricated electronic records, such content shall not fall within the scope of this exemption and may be treated as unlawful SGI.

Accessibility and quality improvements

Definition: Use of computer resources solely for improving accessibility, clarity, quality, translation, description, searchability, or discoverability, without generating, altering, or manipulating any material part of the underlying audio, visual or audio-visual information.

Illustrative example: Adding closed captions and audio descriptions to a training video for hearing- or visually-impaired audiences, translating user interface text into multiple languages, or creating searchable metadata tags for a document library.

Do Note

Not every AI-assisted creation or edit amounts to SGI. Content is regarded as SGI only where it has been artificially or algorithmically generated or modified so as to present itself as real, authentic, or true, and is likely to be indistinguishable from an actual individual or a real-world event. Routine, good-faith edits, along with improvements made for accessibility or document preparation, are expressly excluded under Rule 2(1)(wa) of the IT Rules.

SCOPE EXPANSION

The Notified Rules reflect an expansion in the applicability and scope by expressly bringing SGI within the regulatory framework applicable to “information” in the context of unlawful acts. A clarification has been introduced in the Notified Rules to provide that references to “information” under Rule 3(1)(b) and Rule 3(1)(d), as well as Rules 4(2) and 4(4) of the IT Rules, shall be construed to include SGI, unless the context otherwise requires.

By virtue of this clarification, intermediaries and SSMIs are required to extend their existing due diligence obligations to SGI, including the obligation to make reasonable efforts to prevent the dissemination of unlawful content, to remove or disable access to such content upon receipt of appropriate government notification or court orders, and to comply with applicable tracing and monitoring requirements.

The Notified Rules further provide that removal of, or disabling access to, any information, including SGI, data, or communication links by an intermediary in compliance with the IT Rules shall not be treated as a contravention of the safe harbour conditions set out under Section 79(2) of the Information Technology Act, 2000 (“IT Act”).

REVISED OBLIGATIONS

User Obligations

The Notified Rules strengthen user obligations and require intermediaries to remind users of such obligations more frequently. The amendment to Rule 3(1)(c) of the IT Rules now requires intermediaries to inform users at least once every three (3) months rather than once a year about the consequences of violating their rules, privacy policy, or user agreement. This information must be shared in a simple and effective manner, through the intermediary’s terms and conditions, rules, privacy policy, user agreement, or any other appropriate means, in English or any language listed in the Eighth Schedule to the Constitution.         

 


Takedown of Unlawful Information

The Notified Rules require intermediaries to act much faster on receiving “actual knowledge” of any unlawful content present on the platform. If such knowledge is received through a court order or a reasoned notice from an authorised officer of the Appropriate Government or its agency, the intermediary must remove or disable access to the specified content within three (3) hours of receiving such order or notice. Earlier, intermediaries had thirty-six (36) hours to comply; this has now been reduced to three (3) hours.

ACCELERATED TAKEDOWN AND COMPLIANCE TIMELINES

Pursuant to the Notified Rules, the prescribed timelines for intermediaries to address user grievances and act upon specified categories of content have been materially revised and shortened. The revised timelines are:

Resolution of user grievances
Previous timeline (prior to Notified Rules): Within 15 days of receiving the complaint
Revised timeline (Notified Rules): Within 7 days of receiving the complaint

Takedown of specified harmful content (e.g., obscene content, content harmful to children)
Previous timeline: Within 72 hours of reporting
Revised timeline: Within 36 hours of reporting

Removal of intimate or impersonation content (e.g., content exposing private areas, depicting nudity or sexual acts, or involving impersonation including morphed or manipulated images)
Previous timeline: Within 24 hours of receiving the complaint
Revised timeline: Within 2 hours of receiving the complaint

DUE DILIGENCE BY INTERMEDIARIES OFFERING COMPUTER RESOURCES FOR CREATION OF SGI

The Notified Rules amend Rule 3 of the IT Rules to introduce comprehensive due diligence requirements for intermediaries offering computer resources that enable, permit, or facilitate the creation, generation, modification, alteration, publication, transmission, sharing, or dissemination of SGI, in the following manner:

  1. Deployment of preventive technical measures: The Notified Rules introduce a new requirement mandating intermediaries to deploy reasonable and appropriate technical measures, including automated tools or other suitable mechanisms, to prevent users from creating, generating, modifying, altering, publishing, transmitting, sharing, or disseminating SGI that violates any law, including the IT Act, the Bharatiya Nyaya Sanhita, 2023, the Protection of Children from Sexual Offences Act, 2012, and the Explosive Substances Act, 1908. Specifically, intermediaries must prevent the creation or dissemination of SGI that:

    • Contains child sexual exploitative and abuse material, non-consensual intimate imagery content, or is obscene, pornographic, paedophilic, invasive of another person's privacy (including bodily privacy), vulgar, indecent or sexually explicit;

    • Results in the creation, generation, modification or alteration of any false document or false electronic record;

    • Relates to the preparation, development or procurement of explosive material, arms or ammunition; or

    • Falsely depicts or portrays a natural person or real-world event by misrepresenting, in a manner that is likely to deceive, such person's identity, voice, conduct, action, statement, or such event as having occurred.

  2. Labelling and metadata requirements: The Notified Rules require intermediaries to ensure that such SGI is prominently labelled so that the disclosure has “prominent visibility” and is easily noticeable and adequately perceivable, or, in the case of audio content, is conveyed through a prominently prefixed audio disclosure. The labelling must allow immediate identification of such information as having been synthetically generated, created, modified or altered using a computer resource (for example, a lawful SGI or AI-generated video should carry a visible 'synthetically generated' notice). Unlike the Draft Rules, which required labels to cover at least ten (10) percent of the surface area of visual content or, in the case of audio content, play during the initial ten (10) percent of its duration, the Notified Rules give intermediaries discretion to determine how to prominently label such content.

Additionally, the Notified Rules require SGI to be embedded with permanent metadata or other appropriate technical provenance mechanisms, to the extent technically feasible, including a unique identifier, to identify the computer resource of the intermediary used to create, generate, modify or alter such information.

  3. Prevention of removal or suppression: Intermediaries are required to prevent any modification, suppression or removal of the label, permanent metadata, or unique identifier displayed or embedded in the SGI.

The Draft Rules' requirement that labels cover at least ten (10) percent of the surface area of visual content or, in the case of audio content, play during the initial ten (10) percent of its duration was heavily criticised by stakeholders and industry participants. The rationale for requiring 10% of the content to be covered by the identifier appeared arbitrary, and applying a uniform threshold across varying lengths and formats of content was considered impractical and disruptive to user experience. This requirement, coupled with the wide definition of SGI under the Draft Rules, which covered a broad range of benign content, could have led to extensive labelling of content, risking desensitising users and creating notification fatigue.

The Notified Rules have addressed this concern by granting intermediaries discretion to determine how to "prominently" label visual content or prefix audio disclosures. While this is a positive development for industry, giving intermediaries flexibility to determine how content should be labelled, it introduces interpretive challenges as to what constitutes sufficient prominence, potentially leading to inconsistent implementation across platforms.

Additionally, the obligation on an intermediary to label SGI and embed metadata or identifiers is limited to SGI created or modified using its own computer resource (for example, an AI-powered image or video generation tool offered by the intermediary). It may therefore be interpreted that an intermediary is not required to label SGI that was created elsewhere and is merely hosted on its platform. While this interpretation may provide some operational flexibility with respect to the labelling and metadata or identifier embedding requirements, it should not be construed as relieving intermediaries of their other statutory obligations under the IT Rules, including the obligation to take down unlawful content, comply with grievance redressal timelines, and observe other due diligence obligations in respect of SGI hosted on their platforms, irrespective of whether such content was created using their own computer resources. Additionally, the blanket requirement to embed SGI with permanent unique metadata or identifiers remains a heavy compliance burden.

ADDITIONAL DUE DILIGENCE REQUIREMENTS FOR SSMIs

The Notified Rules introduce enhanced due diligence obligations for SSMIs that enable users to display, upload, or publish information on their platforms. As defined under the IT Rules, an SSMI is a social media intermediary having more than fifty (50) lakh registered users in India, where a “social media intermediary” is an intermediary that primarily or solely enables online interaction between users and allows them to create, upload, share, disseminate, modify, or access information through its services. The Notified Rules require SSMIs to observe the following measures:

  1. Prior to displaying, uploading, or publishing any information, the SSMI must require users to declare whether the information is synthetically generated.

  2. The intermediary must deploy reasonable and appropriate technical measures, including automated tools or other suitable mechanisms, to verify the accuracy of such declarations, taking into account the nature, format, and source of the information.

  3. Where such declaration or technical verification confirms that the content is synthetically generated, the intermediary must clearly and prominently display an appropriate label or notice indicating that the content is synthetically generated.

The Notified Rules clarify that if it is established that an SSMI knowingly permitted, promoted, or failed to act upon SGI in violation of the Notified Rules, it would be deemed to have failed to exercise due diligence and would risk losing the safe harbour protection under Section 79 of the IT Act. For clarity, the Notified Rules specify that the responsibility of an SSMI extends to taking reasonable and proportionate technical measures to verify user declarations and to ensure that no SGI is published without the required declaration or label.

These obligations were initially proposed in the Draft Rules and, despite industry concerns regarding practical implementation challenges and the technical limitations of verification tools, have been retained in the Notified Rules without substantive modification.

Considering the heightened obligations of SSMIs, in practice, SSMIs will need to deploy verification tools tailored to the nature, format, and source of uploaded information. For instance, when a user uploads a video, the platform's automated system may analyse metadata, visual patterns, and audio signatures to verify whether the content is synthetically generated. If the user declares the content as synthetically generated, or if the technical measures confirm it as such, the platform must apply a clear and prominent label such as "This video was created using AI" or "Synthetically generated content" before allowing publication.

Furthermore, the liability implications of these requirements are significant. For example, consider a user who uploads deepfake content depicting a public figure making false statements, in clear violation of the prohibition on SGI that falsely portrays a natural person in a manner likely to deceive. If the SSMI's technical measures fail to detect this content, whether due to limitations in the automated tools or because the verification mechanisms were not sufficiently robust to evaluate the nature of such content, and the content is published without the required declaration or label, the SSMI risks losing its safe harbour protection. In such cases, the platform could be held liable for hosting unlawful content, facing potential legal action and penalties. Given these heightened obligations and liability risks, SSMIs should proactively invest in robust, scalable verification infrastructure that combines automated detection tools with human oversight mechanisms.

CONCLUSION

Under the IT Rules, intermediaries were required to inform users of the intermediary's right to terminate access or remove non-compliant information in case of violations only once a year. The Notified Rules significantly increase this compliance burden by requiring all intermediaries to notify users of such information every three (3) months, along with mandating the additional disclosures prescribed above. Intermediaries must ensure that such information is communicated through suitable methods such as email notifications, pop-ups, in-app notifications, or other effective communication channels. This enhanced frequency of user notifications, coupled with the expanded scope of disclosures, reflects MeitY's intent to ensure greater transparency and user awareness.

The reduced timelines for grievance handling and information takedown reflect MeitY’s intent to ensure prompt action against such content, particularly in cases involving sensitive matters such as intimate content and impersonation. However, the reduced timelines also impose significant operational and compliance demands on intermediaries, requiring strong moderation infrastructure and mechanisms capable of meeting the revised deadlines.


In case of feedback, suggestions, or queries, please reach out to the C&M team on




