Private TikTok accounts harbor child exploitation content on platform

The illegal child sexual exploitation goes on in "post-in-private" accounts, which are explicitly forbidden by TikTok’s security policies but nevertheless exist in abundant quantity. 

Mia Ashton, Montreal QC

An investigation by Forbes has revealed that TikTok has a rampant child sexual exploitation issue, with the platform allowing predators to groom minors and share illegal images through private accounts despite the popular social media app claiming to have a "zero tolerance" policy for child sexual abuse material.

"There’s quite literally accounts that are full of child abuse and exploitation material on their platform," child sexual abuse survivor and children’s safety advocate Seara Adair told Forbes. "Not only does it happen on their platform, but quite often it leads to other platforms, where it becomes even more dangerous."

Adair first encountered a "post-in-private" account in March, when someone made public a video that had been shared to the private TikTok account @My.Privvs.R.Open. The video showed a pre-teen "completely naked and doing inappropriate things." Adair immediately reported the video, but TikTok responded that it "didn’t find any violations."

In a video posted the next day, Adair explained that the video had been made public accidentally by a member of the closed group for predators, and had since been set to private again rather than deleted.

In an email to Forbes, a TikTok spokesperson claimed the social media platform has zero tolerance for such material and "abhorrent behavior," and stated that whenever they become aware of such content, they "remove it, ban accounts, and make reports" to the National Center for Missing and Exploited Children. TikTok also claimed that all videos are subject to the platform’s AI content moderation.

The investigation revealed how easily predators can get around TikTok’s guidelines. Instead of typing the phrase "posting in private," which would be flagged as a violation, they use "algospeak": deliberate misspellings such as "postinprvts" or "pos.t.i.privs" that avoid triggering the algorithm. Adair also believes users add a black screen for a few seconds at the beginning of the illegal videos to trick the AI moderation.

A Forbes investigator was able to log in to several post-in-private accounts with little effort; some required only a pledge to contribute images as part of a lax vetting process, and some openly recruited girls aged 13 and up.

In one video, a young girl slowly removes her school uniform until she is completely naked. In others, young girls remove their shirts and bras and fondle their breasts. This is despite TikTok prohibiting "content that depicts or implies minor sexual activities" and "content depicting a minor undressing."

There are invitations to move to other social media platforms such as Discord, even though again, TikTok prohibits content that "directs users off platform to obtain or distribute CSAM."

"There is this trend of either closed spaces or semi-closed spaces that become easy avenues for networking of child abusers, people wanting to trade child sexual abuse materials," Haley McNamara, director of the International Centre on Sexual Exploitation told Forbes. "Those kinds of spaces have also historically been used for grooming and even selling or advertising people for sex trafficking."

According to Dr. Jennifer King, the privacy and data policy fellow at the Stanford Institute for Human-Centered Artificial Intelligence, TikTok could detect these violations by using two-factor authentication.

"It's a red flag," King told Forbes. "You can absolutely know this is happening."

McNamara added that while TikTok may have safety policies in place to protect minors from exploitation, "what happens in practice is the real test." It is a test the platform, which is used daily by more than half of minors in the US, appears to be failing miserably.

Adair said she has attempted to contact TikTok employees about the discoveries she has made, but told Forbes she has never heard back from a single one. She does hear from young girls who have been groomed by predators on the social media platform though, and noted that almost every single one “has not told their parents what has happened.”
