What you need to know
- Meta is prioritizing age-appropriate content for teenagers on Instagram and Facebook, guided by expert advice.
- As a result, Instagram will automatically set the most restrictive content controls for teens, and content related to self-harm narratives will be hidden from teenagers.
- These adjustments apply to all users under 18 on Instagram and Facebook, with the full implementation expected in the coming months.
Instagram and Facebook are cracking down on content about tough topics like suicide and eating disorders for teens, meaning young users will stop seeing such posts, even from friends.
Meta’s latest overhaul of its privacy and safety tools, announced by the company Tuesday, is designed to give teens on its platforms a more age-appropriate experience, which should help better safeguard their wellbeing.
The new updates will see Meta remove content related to topics such as self-harm and eating disorders from teens’ Instagram and Facebook feeds. In addition to hiding content in sensitive categories, teen accounts will also be defaulted to restrictive filtering settings that tweak what kind of content on Facebook and Instagram they see.
If a teen searches for this type of content on Facebook and Instagram, they’ll instead be directed toward “expert resources for help” like the National Alliance on Mental Illness, according to Meta. Teens won’t see content in these categories even when it’s shared by accounts they follow, and they may not be aware it has been hidden.
The change also affects recommended posts in Search and Explore that could be “sensitive” or “low quality.” Meta will automatically place all teen accounts on the most restrictive content settings, though users can change these settings themselves.