

Big Tech companies like Meta, which owns Instagram, Facebook and WhatsApp, and Alphabet, which owns Google and YouTube, have ensnared millions of children and young people in a labyrinth of online games and habits.
Despite thousands of lawsuits and reams of anecdotal evidence, the power of Big Tech has ensured these issues were denied and obfuscated. But now that might change. In two lawsuits in the United States (US), jurors have ruled against these companies, awarding huge damages for intentionally causing social media addiction and mental illness.
Last Wednesday, a Los Angeles jury awarded $6 million in damages to a young woman who had sued Meta and YouTube for deliberately harming the 20-year-old’s mental health. The epoch-making judgement removed the shield behind which these companies had hidden for years: Section 230, a federal US law that had given them immunity from third-party content posted on their platforms.
New grounds upheld
Filed by a young woman named Kaley, the LA case broke new ground. The tech companies were held accountable for harm caused by the deliberate design of their platforms rather than the content they host. The jury upheld Kaley’s arguments that Instagram and similar platforms were designed to draw youngsters in with endless scrolling feeds and beauty filters, and that their algorithms were engineered to make the platforms addictive.
A day before the LA order, a New Mexico jury, also in the US, found Meta had violated a state law by not warning users of the dangers the platform posed to children from sexual predators. Meta was held liable for willfully engaging in “unfair and deceptive” trade practices, and ordered to pay $375 million in damages.
Spokespersons for Meta and Google have vowed to appeal. They assert that no general conclusions can be drawn, as each case is different. Despite the brave talk, these two rulings are watershed moments for Big Tech, which has so far functioned with impunity. As many have put it, this is the Big Tobacco moment for the digital world.
“This verdict sends a clear message to an entire industry that the era of operating without consequence is over,” Mark Lanier, lead trial counsel for the plaintiff, said in a statement.
These rulings will bear significance beyond the shores of the US. The burgeoning digital class of Indian youth too has not escaped this generational trap.
Horrific cases in the recent past have barely created a flutter. On February 4 this year, three minor sisters, aged 12, 14 and 16, from Ghaziabad, U.P., died by suicide after months of addiction to an online mobile game. Police investigations continue, but no action has been taken against the gaming app that triggered the tragedy.
Pranshu, a queer artist, died by suicide in 2023 following intense online trolling and cyber bullying after posting a reel on Instagram wearing a sari.
The crisis is much larger in India than is perceived. A recent survey has shown that 36.9% of college students display addiction-related behaviour, including anxiety, depression and low self-esteem. The addiction is rampant among 15-24 year-olds, who comprise 71% of India’s internet users.
Legislation needed
Despite social media addiction being widespread in India, there is hardly any legal intervention calling out the role of tech companies, and no successful lawsuits against the companies that operate these apps. In April last year, the Supreme Court dismissed a PIL seeking a ban on the use of social media by children under 13. The apex court held it was a policy matter for legislation and not a judicial call.
While judicial rulings, like the one in Los Angeles, will make punitive claims easier against tech companies, ultimately legislation is the way forward. Laws and rules will have to be put in place to monitor online platforms and keep young people safe. Deterrent fines on companies that violate the law will play a big role.
Big Tech companies can hardly be expected to fall in line voluntarily. The commercial value of a marketplace of young people is irresistible. Those under 18 spend more time on the internet and are aggressive consumers of digital products. Social media platforms Facebook, Instagram, Snapchat, TikTok, X, and YouTube collectively derived $11 billion in advertising revenue from US-based users younger than 18 in 2022, according to a study led by the Harvard T.H. Chan School of Public Health.
Ironclad laws will therefore have to be the big protective wall. Australia was the first to bring in protective legislation for children under 16. Its Online Safety Amendment (Social Media Minimum Age) Act 2024 took effect on 10 December last year, forcing major platforms to deactivate millions of underage accounts. Denmark too has passed a law to protect children, while the UK, Spain, France and Malaysia are at an advanced stage of passing similar legislation.
In India, despite the ballooning problem, there is no determined move to legislate a ban on the use of social media by children. The Digital Personal Data Protection (DPDP) Act, 2023 does mandate verifiable parental consent for users under 18 and prohibits platforms from tracking children or targeting them with advertising, but it stops well short of restricting access.
Sadly, cases of addiction and depression are seen as individual aberrations to be treated with counselling and parental control. There are no parameters set to make social media platforms culpable for endangering the mental health of young users. The sooner Gen Z’s mental illness is seen as the residue of corporate greed, the quicker the solutions will come.