A new investigative report from the Wall Street Journal has revealed that Instagram is helping to connect and promote networks of accounts that commission and purchase underage sex content.
The report, based on investigations by the outlet and researchers at Stanford University and the University of Massachusetts Amherst, found that Instagram's algorithms actively promote pedophile accounts and guide pedophiles to content sellers through recommendation systems "that excel in linking those who share niche interests."
The social media platform has allowed people to search for explicit hashtags such as "pedowhore" and "preteensex," and has connected those searchers to accounts advertising the sale of child sex material. Those accounts often claim to be run by children and use handles such as "little slut for you."
Those accounts offering to sell pedophile sex materials post "menus" of content, with some accounts inviting buyers to commission specific sexual acts.
Some of these menus include pricing for videos of children hurting themselves and "imagery of the minor performing sexual acts with animals," Stanford Internet Observatory researchers found.
Content menus also include in-person "meet ups" with children.
Federal law and Meta content rules both prohibit the promotion of underage sex content. Meta told the outlet that it has set up an internal task force to address the issue, and acknowledged that there are problems within its enforcement operations.
"Child exploitation is a horrific crime," the company said, adding, "We’re continuously investigating ways to actively defend against this behavior."
Meta said that it has taken down 27 pedophile networks in the past two years and is planning more removals. Since being contacted by the Wall Street Journal, Meta said it has blocked thousands of hashtags that sexualize children, some with millions of posts, and has restricted its systems from recommending search terms associated with sex abuse to users.
Test accounts set up by researchers at Stanford's Internet Observatory and UMass' Rescue Lab that viewed a single account in the pedophile network were instantly given "suggested for you" recommendations of child sex content buyers and sellers, as well as accounts linking to off-platform content-trading sites.
"Following just a handful of these recommendations was enough to flood a test account with content that sexualizes children," the Wall Street Journal wrote.
Stanford researchers discovered 405 sellers of "self-generated" child sex content, accounts that claimed to be run by children, some as young as 12. Of those, 112 had a combined 22,000 unique followers.
Other accounts in Instagram’s pedophile community aggregate pro-pedophilia memes, or discuss their access to children.
Current and former Meta employees who worked on Instagram child-safety initiatives told the outlet that the estimated number of accounts primarily following such content is in the hundreds of thousands, if not millions.
In 2022, the National Center for Missing & Exploited Children received 31.9 million reports of child pornography, up 47 percent from two years earlier. Most of the reports came from internet companies.
Meta, whose apps include Instagram, Facebook, and WhatsApp, accounted for 85 percent of the child pornography reports filed to the center, including around 5 million from Instagram alone.
This is a breaking story and will be updated.