Instagram, under its parent company Meta, announced the introduction of new "Teen Accounts" that will automatically place all users under 18 into a more private and restrictive environment. This sweeping change, which begins rolling out this week, is part of Meta's broader efforts to address growing concerns over the safety and well-being of teenagers on its platforms.
The new teen accounts will incorporate enhanced privacy settings, limiting who can interact with these users. According to Meta, teens will be placed in private accounts by default, meaning they can only be messaged, tagged, or mentioned by people they already follow. This marks a departure from previous practice: such privacy settings were once optional, but they will now be the standard for all users under 18.
Naomi Gleit, Meta's head of product, emphasized the significance of these changes during an interview, stating, "Everyone under 18, creators included, will be put into teen accounts. They can remain public if their parent is involved and gives them permission and is supervising the account. But these are pretty big changes that we need to get right."
This latest initiative follows a series of public criticisms and legal challenges aimed at Meta regarding the safety of minors on its platforms. In January, Meta CEO Mark Zuckerberg publicly apologized to parents during a Senate hearing on online child safety, where Instagram was accused of contributing to issues such as exploitation and suicide among teenagers.
In response to such criticisms, Meta has gradually rolled out various safety features over the years, including parental supervision tools. However, these measures have often been fragmented and inconsistently applied, leading to calls for more comprehensive and standardized protections. Gleit acknowledged this in her remarks, noting, "We've gotten a lot of feedback from parents, mostly about how things could be simpler, easier to use, and more consistent. That's what this launch is aiming to address."
One of the new features of these teen accounts is a "Sleep Mode," which silences notifications and sends auto-replies during designated hours (10 p.m. to 7 a.m. by default) to encourage healthier usage patterns among teens. Additionally, a "Daily Limit" prompt will remind teens to close the app after 60 minutes of use, and parental supervisors will have the option to set stricter time limits.
Instagram's new privacy and content restrictions also include the "Hidden Words" feature, which will automatically filter out offensive language from comments and direct messages. Teens will also be placed in the most restrictive content setting, limiting exposure to potentially sensitive content from accounts they do not follow.
Meta is aware that some teens might attempt to bypass these restrictions by lying about their age. To counter this, the company is deploying advanced age verification methods. Users attempting to change their age from under 18 to over 18 will be required to provide additional verification, such as a government ID or a video selfie. Gleit mentioned that Meta is also developing AI-driven tools to predict whether users claiming to be adults might actually be teens, based on their behavior and interactions on the platform.
Despite the rollout of these features, there are concerns about how effectively they will be enforced. Meta has admitted that it expects some teens to find workarounds, and the company is preparing for this by enhancing its detection and verification processes. "We know some teens are going to try to lie about their age to get around these protections," said Antigone Davis, Meta's global head of safety. She added that the company is taking a "multi-layered approach" to age verification, given the complexities involved.
The introduction of these teen accounts is part of Meta's broader response to increasing scrutiny from lawmakers and advocacy groups. Following the whistleblower revelations from Frances Haugen in 2021, which exposed internal Meta research on the negative impacts of its platforms on teen mental health, there has been growing legislative pressure to regulate social media companies more tightly. In July, the U.S. Senate passed significant online child safety legislation, further highlighting the urgency of the issue.
While Meta's new measures have been welcomed by some as a step in the right direction, others remain skeptical. Critics argue that while these tools are necessary, they may not be sufficient to fully protect teens from the myriad dangers of social media. The effectiveness of these features will largely depend on their implementation and Meta's commitment to ongoing enforcement.
As these changes begin to roll out in the U.S., U.K., Australia, and Canada, Meta plans to expand the teen accounts to the European Union later this year and across its other platforms in 2025. The company has yet to provide detailed timelines for when all existing teen accounts will be fully transitioned to the new settings.