NEW DELHI: Meta Platforms on Thursday announced a major update to Instagram aimed at strengthening online safety for teenagers in India, introducing enhanced protections and stricter content controls by default.
The move centres on an expansion of Instagram’s revamped “Teen Accounts,” which draw on movie-style age-rating frameworks and feedback from parents. Under the new system, users under 18 will automatically be placed into a 13+ safety setting and will not be able to opt out without parental approval.
According to Meta, the updated settings are designed to limit teenagers’ exposure to potentially harmful material. Content featuring strong language, risky stunts, or themes that could encourage unsafe behaviour will be filtered out entirely or excluded from recommendations.
The company said it will also step up enforcement by proactively identifying and restricting content that violates its age-appropriate guidelines using improved detection technology. Teen users will be prevented from following accounts that consistently share inappropriate content and will not be able to interact with such material.
In addition, Instagram is introducing a stricter “Limited Content” mode. This setting tightens controls further by filtering a wider range of content and disabling the ability to view, leave, or receive comments on posts.
Meta acknowledged that no moderation system is flawless but emphasised its commitment to continuous improvement. The company said the changes are intended to reassure parents while giving them greater oversight of their children’s online experience.
The update reflects a broader push by Instagram to prioritise teen safety through responsible content curation, aligning platform policies with established age-rating standards and incorporating parental input into product design.
(With inputs from ANI)