Instagram Still a Threat to Children Despite Safety Updates, Say Campaigners

Date: April 23, 2025 | By: Nuztrend Team

Photo by Shutter Speed on Unsplash

April 23, 2025 — While Meta continues to roll out new safety features aimed at protecting minors on Instagram, leading child safety campaigners warn that the platform remains fraught with dangers for children and teenagers. The criticism highlights gaps in content moderation and the ongoing presence of harmful material that puts young users at risk.

Meta’s Recent Measures Fall Short

In recent months, Meta introduced several updates designed to enhance teen safety. These include default private account settings for users under 16, AI-based age verification tools, and a ban on livestreaming for underage accounts without parental approval.

However, according to The Guardian, critics argue that these changes are not enough. Campaigners claim that Instagram’s core design and algorithmic content delivery continue to expose children to inappropriate, graphic, or even exploitative content.

Alarming Gaps in Moderation

A joint investigation published earlier this year uncovered the circulation of AI-generated child abuse imagery on Instagram that evaded platform detection tools. Despite ongoing efforts by Meta’s moderation teams, loopholes in automated content filtering have raised fresh concerns among safety experts and parents.

In response, regulators are stepping in. The UK’s media watchdog, Ofcom, is preparing to enforce tighter controls under the Online Safety Act, including mandatory age checks and stricter takedown timelines. As the Financial Times reports, these regulations aim to hold social platforms accountable for failing to protect young users.

What Campaigners Want

  • Stronger algorithm controls to prevent harmful content from being recommended to teens
  • Human-led moderation of high-risk content categories
  • Independent audits of platform safety practices
  • Full transparency reports on safety incidents involving minors

Organizations such as the Molly Rose Foundation have repeatedly described Meta’s moderation policy changes as “deeply concerning,” warning that a failure to take stronger action could lead to further mental health crises among teens.

Looking Ahead

With regulators watching closely and campaigners raising the alarm, Instagram faces increasing pressure to go beyond cosmetic fixes. As the platform continues to evolve, experts insist that child protection must become a foundational pillar—not an afterthought—in the social media landscape.

Disclaimer: This article is based on publicly available information from various online sources. We do not claim absolute accuracy or completeness. Readers are advised to cross-check facts independently before forming conclusions.
