Takeaways
- Instagram is announcing measures to further protect people from sextortion, including hiding follower and following lists from potential sextortion scammers, preventing screenshots of certain images in DMs, and rolling out our nudity protection feature globally
- These updates, which are part of a campaign informed by NCMEC, Thorn and Childnet, also aim to help parents feel more equipped to support their teens in avoiding these scams
Sextortion is a horrific crime in which financially driven scammers target young adults and teens around the world, threatening to expose their intimate imagery if they don’t get what they want. Today, we’re announcing new measures in our fight against these criminals – including new safety features to further help prevent sextortion on our apps, building on protections already in place.
New safety features to disrupt sextortion
Meta is announcing a range of new safety features designed to further protect people from sextortion and make it even harder for sextortion criminals to succeed.
Now, we’re making it harder for accounts showing signals of potentially scammy behavior to request to follow teens. Depending on the strength of these signals – which include how new an account is – we’ll either block the follow request completely or send it to a teen’s spam folder.
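As a rough illustration of what this kind of signal-based gating can look like in code – the signal names, weights, and thresholds below are hypothetical and do not reflect Meta’s actual detection logic – a decision function might combine a handful of account signals into a risk score and either deliver, divert, or block the follow request:

```kotlin
// Illustrative sketch only: the signal names, weights, and thresholds are
// hypothetical and do not reflect Meta's actual detection logic.
data class AccountSignals(
    val accountAgeDays: Int,        // how new the account is
    val followRequestsLastDay: Int, // volume of recent follow requests
    val recipientReports: Int       // reports filed by people it contacted
)

enum class FollowRequestAction { DELIVER, SEND_TO_SPAM, BLOCK }

fun routeFollowRequest(signals: AccountSignals, targetIsTeen: Boolean): FollowRequestAction {
    if (!targetIsTeen) return FollowRequestAction.DELIVER

    // Each signal adds to a simple risk score; stronger combined signals push
    // the request toward being blocked outright rather than just diverted.
    var risk = 0
    if (signals.accountAgeDays < 7) risk += 2
    if (signals.followRequestsLastDay > 50) risk += 2
    if (signals.recipientReports > 0) risk += 3

    return when {
        risk >= 5 -> FollowRequestAction.BLOCK        // strong signals: block completely
        risk >= 2 -> FollowRequestAction.SEND_TO_SPAM // weaker signals: spam folder
        else      -> FollowRequestAction.DELIVER
    }
}
```

In practice, signals like these would feed much richer machine-learned models, but the three-way outcome – deliver, send to spam, or block – is the behavior described above.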
Sextortion scammers often use the following and follower lists of their targets to try to blackmail them. Now, accounts we detect as showing signals of scammy behavior won’t be able to see people’s follower or following lists, removing their ability to exploit this feature. These potential sextorters also won’t be able to see lists of accounts that have liked someone’s posts, photos they’ve been tagged in, or other accounts that have been tagged in their photos.
Soon, we’ll no longer allow people to use their device to directly screenshot or screen record ephemeral images or videos sent in messages. This means that if someone sends a photo or video in an Instagram DM or on Messenger using our ‘view once’ or ‘allow replay’ feature, they don’t need to worry about it being screenshotted or recorded in-app without their consent. We also won’t allow people to open ‘view once’ or ‘allow replay’ images or videos on Instagram web, so this protection can’t be circumvented in a browser.
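For context on how this kind of screenshot blocking is commonly enforced on mobile platforms – this is the standard Android mechanism, shown as a general illustration rather than Instagram’s actual implementation – an app can mark the window that displays ephemeral media as secure, so the operating system excludes it from screenshots and screen recordings:

```kotlin
import android.app.Activity
import android.os.Bundle
import android.view.WindowManager

// General Android technique for blocking screenshots and screen recordings of
// a screen; shown for illustration, not Instagram's actual code.
class EphemeralMediaActivity : Activity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // FLAG_SECURE tells the system to treat this window's content as secure:
        // it is excluded from screenshots and screen recordings and can't be
        // mirrored to non-secure displays.
        window.setFlags(
            WindowManager.LayoutParams.FLAG_SECURE,
            WindowManager.LayoutParams.FLAG_SECURE
        )
        // ... load and display the 'view once' photo or video here ...
    }
}
```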
We’re constantly working to improve the techniques we use to identify scammers, remove their accounts and stop them from coming back. When our experts observe patterns across sextortion attempts, like certain commonalities between scammers’ profiles, we train our technology to recognize these patterns. This allows us to quickly find and take action against sextortion accounts, and to make significant progress in detecting both new and returning scammers. We’re also sharing aspects of these patterns with the Tech Coalition’s Lantern program, so that other companies can investigate whether the same patterns appear on their own platforms.
Finally, after first announcing the test in April, we’re now rolling out our nudity protection feature globally in Instagram DMs. This feature, which will be enabled by default for teens under 18, will blur images we detect as containing nudity when they’re sent or received in Instagram DMs, and will warn people of the risks associated with sending sensitive images. We’ve also worked with Larry Magid at ConnectSafely to develop a video for parents, available on the Meta Family Center’s Stop Sextortion page, that explains how the feature works.
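As a minimal sketch of the client-side flow such a feature implies – the classifier interface, threshold, and settings fields here are assumptions for illustration, not Meta’s implementation, and the real feature relies on on-device detection models not shown – an incoming image could be scored before display and rendered blurred behind a warning when nudity is likely:

```kotlin
// Hypothetical sketch of the client-side gating for a nudity-protection
// feature; the classifier, threshold, and settings are illustrative assumptions.
interface NudityClassifier {
    /** Returns a probability in [0.0, 1.0] that the image contains nudity. */
    fun score(imageBytes: ByteArray): Double
}

data class UserSettings(val isTeen: Boolean, val nudityProtectionEnabled: Boolean)

sealed class DisplayDecision {
    object ShowNormally : DisplayDecision()
    data class BlurWithWarning(val score: Double) : DisplayDecision()
}

fun decideDisplay(
    imageBytes: ByteArray,
    settings: UserSettings,
    classifier: NudityClassifier,
    threshold: Double = 0.8
): DisplayDecision {
    // Teens under 18 have the protection on by default; others can opt in.
    val protectionActive = settings.isTeen || settings.nudityProtectionEnabled
    if (!protectionActive) return DisplayDecision.ShowNormally

    val score = classifier.score(imageBytes)
    return if (score >= threshold) {
        // Render the image blurred behind a tappable warning about sensitive content.
        DisplayDecision.BlurWithWarning(score)
    } else {
        DisplayDecision.ShowNormally
    }
}
```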
This campaign and new safety features are in addition to our recent announcement of Teen Accounts, which gives tens of millions of teens built-in protections that limit who can contact them, the content they see and how much time they’re spending online. Teens under 16 aren’t able to change Teen Account settings without a parent’s permission.
With Instagram Teen Accounts, teens under 18 will be defaulted into stricter message settings, which mean they can’t be messaged by anyone they don’t follow or aren’t connected to. In the EU, we will start placing teens into Teen Accounts later this year and in the rest of the world Teen Accounts will be available from January.
Taking action against sextortion criminals
Last week, we removed around 1,600 Facebook Groups and accounts that were affiliated with Yahoo Boys, and were attempting to organize, recruit and train new scammers. This comes after we announced in July that we’d removed around 7,200 Facebook assets that were engaging in similar behavior. Yahoo Boys are banned under Meta’s Dangerous Organizations and Individuals policy — one of our strictest policies — which means we remove Yahoo Boys’ accounts engaged in this criminal activity whenever we become aware of them. While we’ve been removing violating Yahoo Boys accounts for years, we’re putting new processes in place which will allow us to identify and remove these accounts more quickly.
We’ll continue to evolve our defenses to help protect our community from sextortion criminals. This includes helping teens and their families recognize these scams early, preventing potential scammers from reaching their targets, and working with our peers to fight these criminals across all the apps they use.