Meta is broadening its Teen Accounts initiative — an experience deemed age-appropriate for users under 18 — to include Facebook and Messenger.
This system automatically places younger teens on these platforms into more restrictive settings, requiring parental consent before they can live stream or turn off protections that blur suspected nude images in messages.
Teen Accounts originally launched last September on Instagram, and Meta claims the system has "fundamentally altered the experience for teens" on that platform. Advocates, however, argue that the actual impact of Teen Accounts remains unclear.
Teen Accounts depend on the user's self-reported age. Users aged 16 to 18 can turn off default safety features, such as keeping their accounts private, on their own. Those aged 13 to 15, by contrast, need parental consent to disable such features, which requires adding a parent or guardian to their profile.

Meta reports that since their introduction in September, at least 54 million teens worldwide have been moved into Teen Accounts. The company also states that 97% of 13 to 15-year-olds have kept its built-in safety measures enabled.
This system relies on users being honest about their age when creating accounts, with Meta utilising methods like video selfies to verify this information.
This year, the company plans to employ artificial intelligence (AI) to identify teens who may be misrepresenting their ages and reassign them to Teen Accounts.