On Tuesday, technology giant Meta announced a series of new safety measures set to protect its underage users from encountering dangerous content on its social media platforms.
Meta will reportedly begin restricting “age-inappropriate content,” such as posts centered on suicidal ideation or eating disorders, for teens, and will automatically place minor users into its most restrictive content setting.
According to Meta, the company wants “teens to have safe, age-appropriate experiences on our apps.”
The move comes as Meta has faced intense scrutiny in recent months over allegations that it knowingly implemented policies harmful to its minor users’ mental health.