Instagram Enhances Teen Safety with New Default Privacy Settings
Instagram implements stricter privacy measures for teen accounts, including default private settings and content restrictions. The move aims to address concerns about social media's impact on youth mental health.
In a significant move to enhance online safety for younger users, Instagram is implementing new default privacy settings for teen accounts. This initiative, which began on September 17, 2024, in several countries including the United States, United Kingdom, Canada, and Australia, aims to address growing concerns about the impact of social media on youth.
The new measures include setting accounts for users under 18 to private by default, restricting direct messages, and limiting exposure to sensitive content. These changes will be applied to new accounts immediately, while existing teen accounts will be migrated over the following 60 days. European Union users can expect similar adjustments later in the year.
Naomi Gleit, head of product at Meta, explained the rationale behind these changes:
"The three concerns we're hearing from parents are that their teens are seeing content that they don't want to see or that they're getting contacted by people they don't want to be contacted by or that they're spending too much time on the app. So teen accounts is really focused on addressing those three concerns."
To combat excessive screen time, Instagram will notify teens after 60 minutes of app usage and implement a "sleep mode" that disables notifications between 10 p.m. and 7 a.m. While 16- and 17-year-olds can opt out of these settings on their own, younger users will need parental permission to do so.
These updates come as Meta, Instagram's parent company, faces legal challenges from multiple U.S. states accusing it of contributing to the youth mental health crisis. The company acknowledges that teens may attempt to circumvent age restrictions and is developing technology to proactively identify and restrict accounts belonging to teens who claim to be adults.
Critics, however, remain skeptical about the effectiveness of these measures. Nicole Gil, co-founder of Accountable Tech, argued that the announcement falls short of addressing fundamental issues:
"Today's PR exercise falls short of the safety by design and accountability that young people and their parents deserve and only meaningful policy action can guarantee. Meta's business model is built on addicting its users and mining their data for profit; no amount of parental and teen controls Meta is proposing will change that."
Launched in October 2010, Instagram had grown to more than 2 billion monthly active users by 2023. The platform has faced ongoing scrutiny for its impact on mental health, especially among young users. In response, Instagram has introduced various features over the years to help users manage their time on the app and combat issues like cyberbullying.
While these new safety measures represent a step towards addressing concerns, the debate continues about the broader implications of social media on youth well-being and the effectiveness of self-regulation in the tech industry.