Meta is taking steps to crack down on the spread of “revenge porn” images of teenagers on Facebook and Instagram.
A new tool, called Take It Down, takes aim at a practice where someone posts an explicit picture of an individual without their consent to publicly embarrass them. Revenge porn has skyrocketed in the last few years on social media, particularly among young boys, according to the National Center for Missing and Exploited Children.
Take It Down, which is operated by NCMEC, will for the first time allow minors to anonymously attach a hash – or digital fingerprint – to intimate images or videos directly from their own devices, without having to upload the files to the new platform. To create a hash of an explicit image, a teen can visit the website TakeItDown.NCMEC.org to install software onto their device. The anonymized number, not the image, is then stored in a database linked to Meta; if the photo is ever posted to Facebook or Instagram, it will be matched against the stored hash, reviewed and potentially removed.
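The article doesn't detail Take It Down's exact hashing scheme, but the core idea – fingerprint the file on the user's device, share only the fingerprint, then compare fingerprints at upload time – can be sketched with an ordinary cryptographic hash. The function names and the in-memory database here are illustrative assumptions, not the real system:

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest that identifies the image without revealing it."""
    return hashlib.sha256(image_bytes).hexdigest()

# Illustrative stand-in for the shared database: it holds only
# fingerprints, never the images themselves.
hash_database = set()

def report_image(image_bytes: bytes) -> None:
    # Hashing happens on the reporter's own device; only the digest leaves it.
    hash_database.add(fingerprint(image_bytes))

def screen_upload(image_bytes: bytes) -> bool:
    # True means the upload matches a reported image and should be reviewed.
    return fingerprint(image_bytes) in hash_database

photo = b"...raw image bytes..."
report_image(photo)
print(screen_upload(photo))         # an exact copy of a reported image matches
print(screen_upload(photo + b"x"))  # an unreported image does not
```

The privacy property follows from the design: a hash is a one-way summary, so the database operator can recognize a reported image when it reappears but cannot reconstruct it.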
“This issue has been incredibly important to Meta for a very, very long time because the damage done is quite severe in the context of teens or adults,” said Antigone Davis, Meta’s global safety director. “It can do damage to their reputation and familial relationships, and puts them in a very vulnerable position. It’s important that we find tools like this to help them regain control of what can be a very difficult and devastating situation.”
The tool works for any image shared across Facebook and Instagram, including Messenger and direct messages, as long as the pictures are unencrypted.
People under 18 years old can use Take It Down, and parents or trusted adults can also use the platform on behalf of a young person. The effort is fully funded by Meta and builds on StopNCII, a similar platform the company launched in 2021 alongside more than 70 NGOs to prevent revenge porn among adults.
Since 2016, NCMEC’s cyber tip line has received more than 250,000 reports of online enticement, including “sextortion,” and the number of those reports more than doubled between 2019 and 2021. In the last year, 79% of the offenders were seeking money to keep photos offline, according to the nonprofit. Many of these cases played out on social media.
Meta’s efforts come nearly a year and a half after Davis was grilled by senators about the impact the company’s apps have on younger users, after an explosive report indicated the company was aware that Facebook-owned Instagram could have a toxic effect on teen girls. Although the company has rolled out a handful of new tools and protections since then, some experts say it has taken too long and more needs to be done.
Meanwhile, President Biden demanded in his latest State of the Union address more transparency about tech companies’ algorithms and how they impact their young users’ mental health.
In response, Davis told CNN that Meta “welcomes efforts to introduce standards for the industry on how to ensure that children can safely navigate and enjoy all that online services have to offer.”
In the meantime, she said the company continues to double down on efforts to help protect its young users, particularly when it comes to keeping explicit photos off its site.
“Sextortion is one of the biggest growing crimes we see at the National Center for Missing and Exploited Children,” said Gavin Portnoy, vice president of communications and branding at NCMEC. “We’re calling it the hidden pandemic, and nobody is really talking about it.”
Portnoy said there’s also been an uptick in youth dying by suicide as a result of revenge porn. “That is the driving force behind creating Take It Down, along with our partners,” he said. “It really gives survivors an opportunity to say, look, I’m not going to let you do this to me. I have the power over my images and my videos.”
Beyond Meta’s platforms, OnlyFans and Pornhub’s parent company MindGeek are also adding the technology to their services.
But the tool has limitations. To get around the hashing technology, people can alter the original images, such as by cropping them, adding emojis or otherwise doctoring them. Some changes, such as applying a filter to make the photo sepia or black and white, will still be flagged by the system. Meta recommends that teens who have multiple copies or edited versions of an image make a hash for each one.
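This limitation is easy to demonstrate with an exact cryptographic hash, which changes completely under even a tiny edit. (The production system evidently uses more robust matching, since it tolerates some filters; the byte-level stand-ins for "cropped" and "filtered" copies below are purely illustrative.)

```python
import hashlib

original = b"raw image bytes"
cropped  = original[10:]       # stand-in for a cropped copy
filtered = original + b"\x00"  # stand-in for a filtered copy

# Each variant yields a completely different digest, which is why
# every edited version must be fingerprinted and submitted separately.
digests = {name: hashlib.sha256(data).hexdigest()
           for name, data in [("original", original),
                              ("cropped", cropped),
                              ("filtered", filtered)]}

for name, digest in digests.items():
    print(name, digest[:16])

print(len(set(digests.values())))  # 3 distinct fingerprints
```
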
“There’s no one panacea for the issue of sextortion or the issue of the non-consensual sharing of intimate images,” Davis said. “It really does take a holistic approach.”
The company has rolled out a series of updates to help teens have an age-appropriate experience on its platforms, such as adding new parental supervision tools and age-verification technology, and defaulting teens into the most private settings on Facebook and Instagram.
Meta is not the first major tech company to pour resources into cracking down on explicit imagery of minors. In 2022, Apple abandoned its plans to launch a controversial tool that would check iPhones, iPads and iCloud photos for child sexual abuse material, following backlash from critics who decried the feature’s potential privacy implications.
“Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all,” the company said in a statement provided to Wired at the time.
Davis did not comment on whether Meta is expecting criticism of its approach, but noted “there were significant differences between the tool that Apple launched and the tool that NCMEC is launching today.” She emphasized Meta will not be checking for images on users’ phones.
“I do welcome any member of the industry trying to invest in efforts to prevent this kind of terrible crime from happening on their apps,” she added.