

To believe or not to believe—that is the question confronting most of us in these times of information inundation. As news and opinions stream constantly across digital platforms, information morphs into misinformation and disinformation. Misinformation is inaccurate or false information; when weaponised with intent, it becomes disinformation. Malinformation, genuine information taken out of context with a motive to malign, is also on the rise in interpersonal contexts.
Disinformation, on the other hand, targets groups of people with the intent to manipulate and deceive. It can affect situations as diverse as electoral results and armed conflicts. It also distorts the debate surrounding significant issues like climate change and public health. Of late, it has gained velocity and influence, even carving out alternate realities. Troll farms and bot farms pushing particular narratives have hijacked many a public discourse.
Rumours and propaganda flourish in times of crisis. However, they have rarely had the kind of traction seen in present times. There was a time they could be debunked with facts and logic. But with the rise of real-time communication, online broadcasting, the use of artificial intelligence and algorithmic biases, the digital media space has become an echo chamber where false and misleading information is amplified. The World Economic Forum in its Global Risks Report 2025 stated that misinformation and disinformation pose a persistent threat to societal cohesion and governance, as they erode trust within and across societies.
In recent memory, we experienced misinformation and disinformation thriving during the pandemic. Playing on fear and anxiety, bogus remedies were peddled and the efficacy of vaccines was questioned. The origin of the virus is still a subject of disinformation campaigns involving state players. This vulnerability is also seen during times of war. State actors crank up the disinformation machinery alongside their war strategies. A disinformation war inflicts damage on the minds of people. The deep penetration of such campaigns in war zones ensures that citizens are ready to serve as cannon fodder.
In democracies across the world, elections are no longer merely ideological fights. Election integrity has become a subject of debate and targeted attacks. The concerned authorities need to guarantee transparency by laying bare the processes and workings of the system. For the devil is in the details. Pre-bunking, a strategy to provide factual information before the onslaught of disinformation, is a possible response. It could serve as an inoculation against malicious campaigns.
Climate action initiatives are also extremely susceptible to misinformation and disinformation, as economic interests are directly involved. Voices denying climate change have emerged among the politically powerful. This has not merely polarised communities, but resulted in cuts to funding for climate-positive action. Many global corporates have rolled back net-zero commitments and resorted to greenwashing. This is an area of concern as it deepens economic inequalities and exploitative practices. An example of a considered response is the Global Initiative for Information Integrity on Climate Change launched by Unesco and Brazil.
Advertisements are often rife with misleading claims meant to boost sales. Disinformation has been known to be embedded in several products. Food and beverages claiming to increase attentiveness and height in children, cosmetics promising age reversal, herbal supplements building immunity, even car engines proclaiming they are environment-friendly are some examples. Some advertisers and manufacturers have retracted unsubstantiated claims due to consumer activism and class action.
A recent news report about a Telangana doctor’s campaign against sugary beverages masquerading as oral rehydration supplements is a case in point. The paediatrician waged a lonely battle against such beverages, which are harmful to children suffering from diarrhoea. Many parents, guided by the ORS labelling, had unwittingly given children these drinks, worsening their dehydration. In October 2025, the Food Safety and Standards Authority of India banned the use of the term ‘ORS’ in beverage trademarks or branding. Misinformation on food labels is commonplace, especially in the processed food industry. Generic claims, unsupported by research or evidence, need to be tackled by regulators and consumers.
The role of AI and social media in this is evident. Micro-targeting of groups based on their online behaviour helps amplify disinformation. Recognising this threat, several countries have evolved collaborative models in which civil-society groups, governments, tech and media companies work together to identify and debunk disinformation. It is also necessary to appreciate that disinformation exploits cognitive biases. It cherry-picks facts and reinterprets them to promote faith-based narratives. It is often characterised by conspiracy theories, exaggeration and intolerance, and is not open to scrutiny. Effective rebuttals rely on a fact-based approach and adopt a sober tenor. Encouraging diverse narratives and fact-checking can reduce the spread.
Social media companies like Meta have recently ended third-party fact-checking deals and propose to use community notes to identify misleading information. The efficacy of this process remains to be seen. Future research may lay bare the underlying networks of disinformation including the mechanics of echo chambers. Precise metrics may also emerge to measure the impact of disinformation on individuals and society.
Till such time, the words of W B Yeats will ring loud: “The best lack all conviction, while the worst are full of passionate intensity.” It is necessary to cultivate the habit of critical analysis, whether one is endorsing or rebutting. Passivity only abets the growth of disinformation.
Geetha Ravichandran
Former bureaucrat and author, most recently of The Spell of the Rain Tree
(Views are personal)