Meta has extended its “Teen Accounts” feature to Facebook and Messenger, months after its debut on Instagram. This expansion brings stricter privacy settings, content filters, and parental oversight tools to teens aged 13 to 15. Initially available in the US, UK, Australia, and Canada, the feature will roll out globally in the coming months.
Teen Accounts were developed in response to growing concerns about the impact of social media on young users’ mental health and online safety. Meta aims to create a safer digital environment by limiting teens’ exposure to harmful content and preventing unwanted interactions.
Under the new system, teens are automatically enrolled in a restricted app experience. The feature limits interactions to friends or users they have previously contacted. This means strangers cannot message them, comment on their posts, or otherwise interact with them unless a prior connection already exists.
These measures, combined with customizable parental controls, reflect Meta’s broader effort to prioritize teen safety and rebuild public trust in its platforms.
Meta Reports Strong Teen Account Adoption
Meta reports encouraging results from its Instagram Teen Accounts feature as it prepares to expand the tool to Facebook and Messenger. Since launching the feature last year, Meta has transitioned 54 million teens globally into the restricted Teen Account experience. The company says 97% of users aged 13 to 15 have kept the default safety settings, indicating strong user acceptance.
Meta commissioned research firm Ipsos to evaluate the feature’s impact. According to the findings, 94% of surveyed parents said Teen Accounts supported their efforts to manage their children’s online experiences. Additionally, 85% of parents agreed the feature made fostering positive digital habits easier.
Teen Accounts offer built-in protections that limit direct contact from unknown users and filter potentially harmful content. Users under 16 must obtain parental consent to weaken any default safeguards. Meta designed these controls to deliver age-appropriate experiences while giving parents greater oversight of their children’s social media activity.
Meta Expands Teen Protections With New Parental Tools And Instagram Safety Features
Meta has announced new safety updates for teens across its platforms, expanding its Teen Accounts protections to Facebook and Messenger. The company says these changes aim to provide greater peace of mind for parents and create safer digital spaces for young users. The expansion builds on the success of the Instagram Teen Accounts feature, which already restricts interactions and content exposure for users under 16.
In addition to the rollout, Meta has introduced several new safety measures on Instagram. Teens under 16 now need parental approval before starting a live broadcast or disabling the nudity protection feature, which automatically blurs suspected nude images in direct messages, offering an added layer of protection for minors.
Addressing concerns over screen time and mental health, Meta has also launched usage management tools. These include daily reminders after one hour of app usage and the automatic activation of Quiet Mode at night, silencing notifications to discourage late-night scrolling. Meta says these features reflect its broader commitment to safer and healthier digital engagement for teens.