In a move aimed at protecting the safety and well-being of teenage users, Meta announced on Tuesday that it is tightening content restrictions on Facebook and Instagram.
The social media giant, formerly known as Facebook, will limit the visibility of sensitive and "age-inappropriate" content for teenagers on its platforms.
According to the company's official statement, Meta is expanding its existing restrictions on content related to self-harm, eating disorders, and mental illnesses.
Initially limited to Reels and Explore pages, these constraints will now extend to teens' regular feeds and Stories, even if the content originates from accounts they follow.
As part of the new update, Meta will also hide more search results and search terms associated with suicide, self-harm, and eating disorders for all users, directing people who search for such content to resources for help.
"We want teens to have safe, age-appropriate experiences on our apps," Meta emphasized in a blog post detailing the upcoming changes. The default content control settings for teens on Facebook and Instagram will be the most restrictive, limiting their exposure to potentially harmful material.
To further enhance privacy and security, Meta will also prompt teens with notifications encouraging them to switch to more private account settings.
This decision comes amid heightened scrutiny of Meta's impact on young users. In 2022, the company faced a lawsuit from a family whose teenage daughter was exposed to content glorifying anorexia and self-harm on Instagram.
The "Facebook Papers," leaked in 2021, also revealed internal research indicating Meta's awareness of Instagram's negative effects on teen girls.
At a congressional hearing in November, a former Meta engineering director and consultant testified that the company needs to do more to protect children online.
Furthermore, a bipartisan group of 33 state attorneys general sued Meta in October, alleging that the company incorporated addictive features targeted at young users.