AI-yo, will deepfakes die of an overkill?


‘Crying wolf’ is an old idiom for raising false alarms until nobody believes a real one, as the fictional boy’s tale of making up threats from a wild animal taught us. You can’t be crying wolf all the time. In the Digital Age, we could be ‘crying snow leopard’, which would take you to the Himalayan heights of fake visuals as artificial intelligence takes illusions to the next level.

Around All Fools’ Day this month, we saw the virally transmitted photo of a mountain girl, said to be from Gilgit-Baltistan in Pakistan Occupied Kashmir, sitting in front of snow-capped mountains next to a snow leopard she was said to have raised since it was a cub. Only, it was fake. But the fakery was not detected until the girl and her pet leopard had been cheerfully circulated.

Next, we had someone we thought was Pandit Ravi Shankar playing the sitar in a social media post by the State Bank of India. It wasn’t the maestro but someone with a face looking like an elongated version of the musician’s, and on closer look, the sitar turned out to be a guitar! SBI, already pummelled for its electoral bonds-linked postures, has since taken the ad down. There is only so much music you can face in a month, with or without a fake Ravi Shankar. And there is only so much string-pulling you can do before you are subjected to leg-pulling.

Those of us anticipating more AI-linked fakes this season have not been disappointed. With the general elections on amid rumours, innuendos and WhatsApp University machinations, even Chhindwara in Madhya Pradesh, which we thought was a tribal-rich Remotistan, now boasts of fake videos. An aide of Congress leader Kamal Nath in his backyard has been booked along with a TV journalist for circulating an allegedly fake obscene video of the BJP candidate in the constituency. This is serious in a season when candidates are best seen and not obscene.

There is another fake video in which the CEO of the National Stock Exchange is said to be making recommendations on what shares to buy. Sebi’s old warning—“Read the offer documents carefully”—rings so hollow in the face of this; the market regulator may now have to say, “Don’t believe what you see and hear. It could be an apparition.” Meanwhile, fake videos of Elon Musk are said to have flooded YouTube in an attempt to steal cryptocurrency. Science fiction may have just met black comedy in a heist thriller.

Between social media and easily available AI software, visual manipulation is nearly as easy as cut-and-paste jobs on your screen. Thanks to fake videos, even dead people can now rise up and talk to you with new scripts, depriving dear Jesus Christ of the Easter advantage he has enjoyed for about two thousand years.

It gets curiouser when we realise some fakes can actually be authorised, official versions. In Tamil Nadu, the late leaders J Jayalalithaa and M Karunanidhi have both risen from the dead thanks to digital resurrection, and are continuing their old political enmity in deepfake videos that look and sound like them.

I won’t be surprised if there is a fake video that tells us that ancient Tamizh Nadu had savants who were presciently aware of the future threat from AI and hence coined a term like “AI-yo”—an essentially untranslatable expression that can generate a range of emotions from pity to caution, threat or fear.

You can now use Elvis Presley’s voice to sing K L Saigal’s soulful songs from the 1940s. But it is not funny when you hear that a video of Congress leader Rahul Gandhi was overlaid with an AI voice clone to claim he had resigned from the Grand Old Party.

All this is giving rise to a new heir-raising tale. We are told the New Age will now likely involve telling people how your likeness or voice may or may not be used after your demise. We may now have new generations singularly unimpressed by Hamlet’s father appearing as a ghost in Shakespeare’s play. Instead, we may have parodies in which the departed dad is only a deepfake hologram in a digitally retold post-modern version.

Fact-checkers will work hard in the short term to circulate reports on the authenticity of videos or audio clips purported to show or say something profound. Editors might as well do this, as their erstwhile work, such as proofreading, grammar-checking and cutting-pasting, is increasingly done by software. While philosophers and policymakers debate the ethics of it all, something tells me this is going to resemble the crying wolf tale. At some point, there is a reverse swing in which the propensity of the average citizen shifts from belief to mistrust.

Ernest Hemingway famously said: “The most essential gift for a good writer is a built-in, shockproof sh** detector.” It has been paraphrased as a virtue that journalists are supposed to have. With fakes becoming the new normal in the age of digitally-generated lookalikes and soundalikes, we may well have the mass production of this journalistic virtue. And we probably will, as the default option, shift from a willing suspension of disbelief to a ‘let us wait for confirmation’ mode.

Perhaps that is wishful thinking. But the crying wolf tale offers us hope that deepfakes may die of an overdose of AI-mongers’ making.

(Views are personal)

Madhavan Narayanan | Senior journalist

The New Indian Express
www.newindianexpress.com