Meta has rolled out new privacy updates for teenagers on Instagram and Facebook to protect them from online harm.
Starting now, anyone under the age of 16 (or under 18 in certain countries) who joins Facebook will automatically be placed in more private default settings, the company said in a blog post.
“We’re now testing ways to protect teens from messaging suspicious adults they aren’t connected to, and we won’t show them in teens’ People You May Know recommendations. A ‘suspicious’ account is one that belongs to an adult that may have recently been blocked or reported by a young person, for example. As an extra layer of protection, we’re also testing removing the message button on teens’ Instagram accounts when they’re viewed by suspicious adults altogether,” reads a note from Meta.
The company has also created several tools that let teens notify it if something makes them feel uncomfortable while using its apps.
Meta is also developing tools to prevent the online spread of self-generated intimate photographs. “We’re working with the National Center for Missing and Exploited Children (NCMEC) to build a global platform for teens who are worried intimate images they created might be shared on public online platforms without their consent,” the company said.
Meta is also working with Thorn and its NoFiltr brand to create educational materials that “reduce the shame and stigma surrounding intimate images, and empower teens to seek help and take back control if they’ve shared them or are experiencing sextortion”.