

The “procedural” amendments recently proposed to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 broaden the remit and the mechanisms to monitor online content and strengthen compliance. On the face of it, as the debate on curbing the deleterious effects of online content reaches a fever pitch among parents, health experts and lawmakers, the move to address “synthetically generated information” (created with artificial intelligence tools), user-generated news and content moderation by social media intermediaries seems well-meaning. But it also raises questions about the broader mechanism for censorship, and about the lack of clear definitions and transparency behind some of the proposals.
It is no surprise that platforms are sought to be made more accountable. Draft Rule 3(4) takes away the discretion of intermediaries to adopt their own standards on freedom of expression by requiring them to comply with “any clarification, advisory, order, direction, standard operating procedure, code of practice or guideline issued” by the electronics and IT ministry relating to due diligence obligations. A failure to comply would strip the intermediary of its safe-harbour protection under the IT Act, 2000. Draft Rule 8 proposes to extend the code of ethics for digital media to “news and current affairs content” created by users who would not otherwise qualify as publishers, leaving hanging the question of who would. Wherever the onus of compliance falls, the price will be paid in new guardrails on the freedom of speech.
Earlier efforts to moderate online content should provide clarity on how not to proceed. While striking down Section 66A of the IT Act in Shreya Singhal (2015), the Supreme Court warned against censorship of speech on vague grounds. That instinct to censor should be curbed, with a clear authority and written reasons mandated for taking down content. In September 2024, the Bombay High Court struck down the 2023 amendment to the IT Rules that had empowered the Centre to set up a fact-checking unit for social media. These amendments should not become another route to reviving such a mechanism. Feedback on the new draft has been sought till April 14. This is the moment for a broader debate on the proposed amendments’ intent and effect before the rules come close to enactment.