New rules for labelling AI content soon: MeitY secretary S Krishnan
The government has finished consultations with the industry on its proposal to make labelling of AI-generated content mandatory, and the new rules will be issued soon, a senior official from the Ministry of Electronics and Information Technology (MeitY) said. According to a report, MeitY Secretary S Krishnan said the industry has been fairly responsible and understands why AI-generated content needs to be labelled. He said there has been no major opposition to the proposal.
However, companies have asked for clarity on what level of AI use would require a label. They want clear guidelines to differentiate between major changes made using AI and routine technical improvements, such as camera enhancements that improve quality but do not change facts.
Krishnan said the government is now discussing these suggestions with other ministries. “We are deciding which changes to accept, which to modify, and what adjustments to make. This process is ongoing, and the new rules should be released very shortly,” he said.
He added that the government is not imposing new restrictions or asking platforms to register with any third party. “All that is being asked is to label the content,” Krishnan said, stressing that people have the right to know whether content is real or created using AI.
The report added that Krishnan noted even small AI edits can sometimes change the meaning of content, while routine technical enhancements may only improve quality without altering facts.
In October, the government proposed changes to the IT Rules to make labelling of AI-generated content compulsory and to increase the responsibility of big platforms like Facebook and YouTube to identify and flag such content. The aim is to reduce harm from deepfakes and misinformation.
The IT ministry said deepfake videos, audio clips, and other synthetic media shared online can create convincing false information. Such content can be misused to spread misinformation, harm reputations, influence elections, or commit financial fraud.
The draft rules require platforms to clearly label AI-generated or modified content using visible markers or metadata. For visual content, the label must cover at least 10% of the screen, while for audio, it must appear in the first 10% of the clip.