Meta Makes It More Difficult for Teens to See Harmful Content
On Jan. 9, Meta announced that it will begin hiding some content from teens under the age of 18 on both Instagram and Facebook, NPR reports.
On Instagram, certain search terms will be restricted.
"Now, when people search for terms related to suicide, self-harm and eating disorders, we'll start hiding these related results and will direct them to expert resources for help," Meta said in a blog post.
The changes come as Meta faces calls to make social media safer for children, in addition to several state lawsuits and "possible federal legislation," NPR reports.
While many advocates say the new policies are a step in the right direction, Jean Twenge, a psychology professor at San Diego State University, says they aren't foolproof.
"You do not need parental permission to sign up for a social media account. You check a box saying that you're 13, or you choose a different birth year and, boom, you're on," Twenge told NPR.
Twenge says that studies have highlighted the link between social media use and increased depression and self-harm rates.
In fact, teens who spend large amounts of time on social media are nearly twice as likely to suffer from depression or harm themselves as moderate social media users.
"There's clearly a relationship with spending too much time on social media and then these negative outcomes," Twenge said.
According to a Meta spokeswoman, the company is working to improve its age verification technology to better detect when kids don't provide their true age.