Meta to strengthen protections for children and teens on social media
Social media giant Meta said on Tuesday it was rolling out a series of measures to tighten the safety of young users on its Instagram platform, becoming the latest company to act under mounting pressure over the issue.
Activists have long criticized tech giants for failing to protect teens from harmful content, and Instagram’s popularity among young people has put it firmly in the firing line.
Meta, which also owns Facebook and WhatsApp, said parents and guardians could set time limits for children’s Instagram scrolling.
And younger users will now see nudges encouraging them to explore other topics if they spend too much time on content about a single subject.
“It is crucial for us to develop tools that respect the privacy and autonomy of young people while involving parents in the experience,” said Clotilde Briend of Meta during a press briefing.
Instagram was rocked last year by revelations from whistleblower Frances Haugen that suggested executives were aware the platform could harm the mental health of young users, especially teenage girls.
Meta has always denied those claims, but has since faced a series of grillings in the US Congress and suggestions that regulation may be on the way.
Other apps, including video-sharing platform TikTok, have also come under fire over fears young people will find it hard to tear themselves away from content.
Last week, TikTok announced that young people would get nudges to remind them to pause scrolling – similar to an Instagram feature that has already rolled out.
On Tuesday, Meta also announced new measures for its virtual reality headsets.
Parents and guardians will be able to block apps, view what their child is watching from another device, and track how much time their child spends in the headset.