New privacy defaults for teens on Facebook
Meta announced that starting today (November 21), everyone under the age of 16 (or under 18 in certain countries) will be defaulted into more private settings when they join Facebook. These private defaults cover:
- Who can see teens’ friends list
- Who can see the people, Pages and lists they follow
- Who can see posts they’re tagged in on their profile
- Reviewing posts they’re tagged in before the post appears on their profile
- Who is allowed to comment on their public posts
[Image: product mock of privacy default notifications and settings on Facebook]
Limiting unwanted interactions
Apart from the new default privacy settings, Meta says it is also changing how adults can interact with teens on Facebook and Instagram. The company already restricts adults from messaging teens they aren’t connected to and from seeing teens in their ‘People You May Know’ recommendations on both platforms.
“In addition to our existing measures, we’re now testing ways to protect teens from messaging suspicious adults they aren’t connected to, and we won’t show them in teens’ People You May Know recommendations. As an extra layer of protection, we’re also testing removing the message button on teens’ Instagram accounts when they’re viewed by suspicious adults altogether,” the company said in a blog post.
A “suspicious” account could be one belonging to an adult who has recently been blocked or reported by a young person.
New tools to stop the spread of teens’ intimate images
Meta is also introducing updates to protect teens from ‘sextortion’ on both platforms. “The non-consensual sharing of intimate images can be extremely traumatic and we want to do all we can to discourage teens from sharing these images on our apps in the first place,” the company said.
Meta says it is working with the US-based non-profit National Center for Missing and Exploited Children (NCMEC) to build a global platform for teens, aimed at preventing the non-consensual sharing of their intimate images, similar to the work it has done for adults.
The company is also working with another non-profit, Thorn, and its NoFilter brand to create educational material that reduces the shame and stigma surrounding intimate images.