
Child porn on Instagram prompts Meta to create task force to investigate

Meta has started a task force to investigate how its photo-sharing app Instagram facilitates the spread and sale of child sexual abuse material.

The new effort by the Facebook parent company follows a report from the Stanford Internet Observatory, which found large networks of accounts, apparently operated by minors, openly advertising self-generated child sexual abuse material for sale.

Buyers and sellers of self-generated child sexual abuse material connected through Instagram’s direct messaging feature, and Instagram’s recommendation algorithms made the advertisements of the illicit material more effective, the researchers found.

“Due to the widespread use of hashtags, relatively long life of seller accounts and, especially, the effective recommendation algorithm, Instagram serves as the key discovery mechanism for this specific community of buyers and sellers,” the researchers wrote.

The findings offer more insight into how internet companies have struggled for years to find and prevent sexually explicit images that violate their rules from spreading on their networks. Experts have highlighted how intimate image abuse, or so-called revenge porn, rose sharply during the pandemic, prompting tech companies, porn sites and civil society groups to bolster their moderation tools. In April, the Guardian reported that its two-year investigation found Facebook and Instagram had become major platforms for buying and selling children for sex.

The impact of Instagram on children and teens has drawn scrutiny from civil society groups and regulators concerned about predators on the platform, user privacy and the mental health effects of the social network. In September 2021, the company paused its controversial plans to build a separate version of Instagram tailored for children under 13. Later that year, lawmakers grilled the head of Instagram, Adam Mosseri, over revelations in documents shared with regulators by Meta whistleblower Frances Haugen showing that Instagram is harmful to a significant portion of young users, especially teen girls.

The Stanford researchers said the overall size of the seller network ranges from 500 to 1,000 accounts at any given time. They said they began their investigation following a tip from the Wall Street Journal, which first reported on the findings.

Meta said it has strict policies and technology to prevent predators from finding and interacting with teens. In addition to the task force, the company said it had dismantled 27 abusive networks between 2020 and 2022, and in January disabled more than 490,000 accounts for violating its child safety policies.

“Child exploitation is a horrific crime,” Meta spokesman Andy Stone said in a statement. “We work aggressively to fight it on and off our platforms, and to support law enforcement in its efforts to arrest and prosecute the criminals behind it.”

While Instagram is a central player in facilitating the spread and sale of child sexualized imagery, other tech platforms also play a role, the report found. For instance, accounts promoting self-generated child sexual abuse material were also highly prevalent on Twitter, although that platform appears to be taking them down more aggressively.

Some of the Instagram accounts also advertised links to groups on Telegram and Discord, some of which appeared to be managed by individual sellers, the report found.
