Meta, the parent company of Facebook and Instagram, has announced plans to show teenagers only age-appropriate content on both platforms.
In a recent blog post, Meta said teenagers will be placed into “the most restrictive” content control settings on Instagram and Facebook.
The social media giant added that it will hide certain types of content and restrict additional search terms in the ‘Search’ and ‘Explore’ features.
It said content relating to suicide, self-harm, and eating disorders will become harder to find and will not be recommended.
Meta said the development aims to make the social media platforms safe and age-appropriate for young people.
This, it said, is in line with the guidance of experts in adolescent development, psychology, and mental health.
“We want teens to have safe, age-appropriate experiences on our apps,” Meta said.
“We have developed more than 30 tools and resources to support teens and their parents, and we have spent over a decade developing policies and technology to address content that breaks our rules or could be seen as sensitive.
“We are announcing additional protections that are focused on the types of content teens see on Instagram and Facebook.
“Now, when people search for terms related to suicide, self-harm and eating disorders, we will start hiding these related results and will direct them to expert resources for help.”