Under Fire, Instagram Adopts PG-13 Model Amidst Safety Criticisms

Facing intense scrutiny over the safety of its platform for teenagers, Instagram is introducing a new set of controls based on the well-known PG-13 film rating system. The measure is being presented as a way to give parents more direct influence over their children’s social media experience and to curb teens’ exposure to mature themes.
The platform will now place all users under 18 into a specially designed 13+ content setting by default. This setting is more restrictive than the standard user experience, and a parent’s approval will be mandatory for a teen to switch to a less filtered version of the app. This represents a significant step towards a “safety by default” model for adolescent users.
The PG-13 mode goes beyond Instagram’s existing safeguards: it will actively demote or hide content featuring profanity, risky stunts, and imagery that promotes harmful behaviors, such as depictions of drug paraphernalia. The company also confirmed it will block specific search terms related to violence and substance use, making such content harder for teens to discover.
This initiative follows a damning report from independent researchers, including a former Meta whistleblower, which asserted that Instagram’s existing safety tools are largely ineffective. Meta has publicly rejected these findings, but the new PG-13 system appears to be a direct response to the mounting pressure from both critics and regulatory bodies demanding stronger action.
While the updates are set to launch soon in the US, UK, Australia, and Canada, with a global rollout to follow, safety advocates like the Molly Rose Foundation remain skeptical. They argue that without independent testing and full transparency, Meta’s announcements are difficult to distinguish from strategic PR, and the true impact on teen safety remains to be seen.