Instagram in the UK Launches New Parental Control Measures


Instagram is rolling out parental control features across the U.K. on Tuesday, including the option to set a daily time limit for teenagers.

The new feature allows parents of teenagers under 18 to set daily time limits of between 15 minutes and two hours. Once the teen reaches the limit, a black screen will appear on Instagram for the rest of the day. Parents can also schedule break times for their kids.

Parents will also receive an alert when their child reports an account or post, along with the reason for the report, the BBC reports. They may also view their child's daily habits on Instagram, but they will need the child's permission to activate the supervision features.

Meta, Instagram's parent company, reiterated that the minimum age to use Instagram is 13. The company also added a parent dashboard to all Quest virtual reality headsets, allowing kids to invite their parents to supervise their accounts.

The virtual reality controls cover purchase approval, app blocking, and the ability to see their children's friend lists.

The parental controls will automatically be disabled when the child turns 18.

Last year, Instagram attempted to create a platform for children under 13, but the plan drew backlash, prompting the social media platform to pause it.

Measures in response to criticisms

The new Instagram features were likely prompted by mounting pressure on social media companies to curb addictive use among teens. The features were first introduced in the U.S. in March.

Instagram and Facebook have been accused of inflicting harm on children since whistleblower Frances Haugen, a former Facebook employee, revealed that Meta had been aware its platforms could be harmful to children's mental and physical health.

In 2017, Molly Russell, 14, killed herself after viewing self-harm and suicide content on Instagram. During the inquest, it was revealed that she used her account more than 120 times a day in the last six months of her life.

Per The Wall Street Journal's report, Facebook, WhatsApp, and Instagram conducted research that found teenagers blamed Instagram for increasing anxiety and depression, but Meta reportedly kept the study secret. In response, Instagram said the story focused "on a limited set of findings" and placed the company in a "negative light."

As per ETXView, Instagram has faced criticism for its impact on young users and its inaction.

Eight lawsuits have also been filed against Meta, claiming that excessive exposure to Facebook has led to suicide, eating disorders, sleeplessness, and other issues.

Parental Control Measures of Other Social Media Platforms

Other social media platforms have also launched similar features recently. TikTok, for example, introduced new screen time controls last week. It now encourages users to curb continuous scrolling by letting them set custom limits on how much time they spend in the app. TikTok also issues users aged 13 to 17 "digital well-being prompts" when they have used the app for more than 100 minutes in a day, Arab News reports.

© 2024 ParentHerald.com All rights reserved. Do not reproduce without permission.
