In recent years, Meta has introduced over 50 tools, resources and features to protect young people online. To help teens have safe online experiences, Meta is testing new features aimed at protecting young people against sextortion and intimate image abuse.
Meta is taking steps to prevent potential “sextorters” from finding and interacting with teens on Instagram, and it plans to introduce a new nudity protection feature that detects when images sent or received in Instagram DMs contain nudity.
Here’s how this feature will work:
— When nudity is detected in an image, the platform will automatically blur it behind a warning screen and remind users of the potential risks of sharing sensitive images.
— Recipients will also see a message encouraging them not to feel pressured to respond, along with options to block the sender and report the chat.
— Instagram will show adults a notification encouraging them to turn the feature on; for teens under 18, it will be turned on by default (see the illustrative sketch below).
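To make the flow above concrete, here is a rough, hypothetical sketch of how client-side logic along these lines could be wired together. The class, function, and field names are illustrative assumptions, not Meta's actual implementation:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class NudityProtectionSettings:
    """Hypothetical per-user settings; not Meta's actual data model."""
    age: int
    user_choice: Optional[bool] = None  # None = user never changed the default

    @property
    def enabled(self) -> bool:
        # Adults see a prompt and can opt in; teens under 18 get it on by default.
        if self.user_choice is not None:
            return self.user_choice
        return self.age < 18


def present_incoming_image(settings: NudityProtectionSettings, flagged_as_nude: bool) -> dict:
    """Decide how an incoming DM image is rendered (illustrative only)."""
    if settings.enabled and flagged_as_nude:
        return {
            "render": "blurred_with_warning_screen",
            "message": "You don't have to respond to this.",
            "actions": ["view_anyway", "block_sender", "report_chat"],
        }
    return {"render": "normal", "actions": []}


if __name__ == "__main__":
    teen = NudityProtectionSettings(age=15)    # protection on by default
    adult = NudityProtectionSettings(age=34)   # would be prompted to turn it on
    print(present_incoming_image(teen, flagged_as_nude=True))
    print(present_incoming_image(adult, flagged_as_nude=True))
```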
The nudity protection feature uses on-device machine learning to analyze whether an image sent or received in a DM contains nudity. Because this analysis happens on the device itself, the feature works in end-to-end encrypted chats, and Meta does not gain access to the images unless a user reports the chat. The nudity protection feature will soon be available worldwide.
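A minimal sketch of the on-device idea follows, assuming a locally bundled classifier. The `StubOnDeviceClassifier` below is a stand-in (Meta's actual model and inference API are not public); the point is that only a boolean flag, never the image itself, leaves the classification step:

```python
import hashlib
from pathlib import Path


class StubOnDeviceClassifier:
    """Stand-in for a classifier bundled with the app; it produces a
    deterministic pseudo-score so the sketch runs end to end."""

    def predict(self, image_bytes: bytes) -> float:
        digest = hashlib.sha256(image_bytes).hexdigest()
        return (int(digest, 16) % 100) / 100  # fake "nudity score" in [0, 1)


def flag_image_locally(image_path: Path,
                       model: StubOnDeviceClassifier,
                       threshold: float = 0.8) -> bool:
    """Run inference entirely on the device. Only this boolean flag feeds the
    blur/warning UI; the image bytes are never uploaded, which is why such a
    check can coexist with end-to-end encrypted chats."""
    image_bytes = image_path.read_bytes()
    return model.predict(image_bytes) >= threshold
```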
A company representative said: “At Meta, we’ve developed more than 50 tools, resources and features to help Florida teens create, explore and connect on our apps in safe, age-appropriate ways. This includes built-in protections for teens, like private accounts and messaging restrictions, as well as supervision features so Florida parents can set the right boundaries for their teens.”
The new feature builds on Meta’s previous work to protect teens from unwanted contact and to remove intimate images that sextorters spread online. The company partnered with the National Center for Missing and Exploited Children (NCMEC) last year on the Take It Down platform, which lets young people take back control of their intimate images and helps prevent them from being shared online.
Meta is committed to implementing safety features across its platforms to protect young people. In addition, Meta continues to hire specialists dedicated to child safety and shares information with industry peers and law enforcement.
To learn more about this new feature and the other safety measures Meta has introduced across its platforms, see Meta’s overview of its safety tools and resources.