In a sharp reversal of its earlier position, the Ministry of Electronics and Information Technology (MeitY) has withdrawn the requirement for platforms to seek government approval before deploying AI models that are still under development. In a new advisory issued on March 15, 2024, MeitY instead emphasizes the labeling of AI-generated content, particularly content that is susceptible to exploitation through deepfake technology.
The previous advisory, issued on March 1, had drawn heavy criticism from startup founders.
Key points of the advisory include:
1. Under the revised advisory, intermediaries are no longer required to submit action-taken reports, but must still comply with immediate effect. The mandates of the revised recommendations remain the same; only the language has been toned down.
2. The new recommendations highlight concerns about lax compliance by intermediaries and platforms with the due-diligence obligations under the existing IT Rules.
3. All intermediaries and platforms are now required to properly label AI-generated content, especially content that is susceptible to deepfake manipulation.
4. Platforms are directed to ensure that their AI models do not allow users to post or share unlawful content.
5. The new advisory maintains MeitY's focus on making deepfakes and misinformation easy to identify. It therefore recommends that intermediaries label content or embed it with "unique metadata or identifiers," whether the content is in audio, visual, textual, or audiovisual form.
6. This label, metadata, or unique identifier should make it possible to determine that the content is artificially generated, modified, or created, and that the intermediary's computer resource was used to make such modifications (see the sketch after this list).
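The advisory does not prescribe a technical standard for these identifiers, so how a platform implements them is an open question. As a purely illustrative sketch, here is one way a platform could tag AI-generated PNG images with a label and a unique provenance ID using the Pillow library; the field names `ai_generated` and `provenance_id` are hypothetical, not part of any mandated schema:

```python
import uuid
from PIL import Image, PngImagePlugin

def label_ai_generated_png(src_path: str, dst_path: str) -> str:
    """Attach an AI-generation label and a unique identifier to a PNG.

    The metadata keys used here are hypothetical examples; MeitY's
    advisory does not specify a schema.
    """
    identifier = str(uuid.uuid4())              # the "unique identifier"
    meta = PngImagePlugin.PngInfo()
    meta.add_text("ai_generated", "true")       # label: content is synthetic
    meta.add_text("provenance_id", identifier)  # traceable back to the platform
    with Image.open(src_path) as img:
        img.save(dst_path, pnginfo=meta)        # metadata travels with the file
    return identifier
```

A scheme like this would let a platform later verify whether a given file originated from its own generative tools, which is the kind of traceability the advisory appears to be asking for.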
The notice was issued to eight major social media intermediaries: Facebook, Instagram, WhatsApp, Google/YouTube (for Gemini), Twitter, Snap, Microsoft/LinkedIn (for OpenAI), and ShareChat. MeitY had previously issued an advisory on deepfakes in December 2023 and the now-superseded advisory on March 1.