Facebook parent company Meta has funded a new platform designed to address these concerns, allowing young people to proactively scan a select group of websites for their images online and have them taken down. Run by the National Center for Missing & Exploited Children, Take It Down assigns a “hash value,” or digital fingerprint, to images or videos, which tech companies use to identify copies of the media across the web and remove them. Participants include tech platforms such as Instagram and Facebook, as well as pornographic websites including OnlyFans and Pornhub.
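To illustrate the hash-matching idea in rough terms, the sketch below shows how a fingerprint, rather than the image itself, can be shared and compared. This is a simplified illustration, not Take It Down's actual implementation: real systems generally rely on perceptual hashes that survive resizing or re-encoding, whereas the cryptographic hash used here only matches byte-identical copies. The file names are hypothetical.

```python
import hashlib
from pathlib import Path


def fingerprint(path: Path) -> str:
    """Return a SHA-256 hex digest of the file's bytes.

    Only this fingerprint needs to leave the user's device;
    the image itself is never uploaded to the service.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()


# Fingerprints submitted by users are collected into a shared list.
reported_hashes = {fingerprint(Path("reported_image.jpg"))}


def should_remove(upload: Path) -> bool:
    """A participating platform checks new uploads against the list."""
    return fingerprint(upload) in reported_hashes
```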
“Having a personal intimate image shared with others can be scary and overwhelming, especially for young people,” Antigone Davis, Meta’s global head of safety, said in a statement announcing the effort. “It can feel even worse when someone tries to use those images as a threat for additional images, sexual contact or money — a crime known as sextortion.”
The new tool arrives as internet platforms have struggled to find sexually explicit images shared without the subject’s consent and prevent them from spreading on their websites. Experts say the problem appeared to grow worse during the pandemic, as use of digital tools swelled.
A 2021 report by the Revenge Porn Helpline found that reports of intimate image abuse increased significantly over the prior five years, including a 40 percent rise in reported cases between 2020 and 2021.
“Oftentimes a child doesn’t know that there’s an adult on the other end of this conversation,” National Center for Missing & Exploited Children spokesperson Gavin Portnoy said in an interview. “So they start demanding more images or more videos and often with the threat of leaking what they already have out to that child’s community, family [and] friends.”
Tech companies that find sexually explicit images of youth are required by law to report the user that posted the material, but no such standard exists for adults. Dozens of states have passed statutes designed to address nonconsensual pornographic imagery, but they are difficult to enforce because Section 230 of the Communications Decency Act offers tech companies legal immunity from user-generated content posted on their websites, said Megan Iorio, senior counsel at the Electronic Privacy Information Center.
Interpretations of that law “allow companies to not only ignore requests to remove harmful content, including defamatory information and revenge porn, but also to ignore injunctions requiring them to remove that information,” Iorio said.
While Take It Down is only open to children under 18 or their guardians, it follows a similar 2021 effort from Meta to help adults find and remove nonconsensual explicit content about themselves. Meta funded and built the technology for a platform called Stop Nonconsensual Intimate Image Abuse, operated by the Revenge Porn Helpline, a service of the U.K.-based tech policy nonprofit SWGfL. Users can submit a case to the helpline, and participating sites, including Facebook, Instagram, TikTok and Bumble, remove the content.
Meta tried a similar approach in 2017, when users could report intimate images of themselves to prompt the company to search for copies on its networks and stop them from being shared again. But the move drew criticism from advocates who said the program could compromise users’ privacy.