The suit, filed by the teenage victim and his mother in the Northern District of California on Wednesday, alleges that Twitter not only left the posts up but also made money from them, the New York Post reported. The video clips that Twitter allegedly reviewed and deemed acceptable showed the then-13-year-old victim engaged in sex acts.
The victim, identified in the suit only as John Doe, is now 17. He alleged that he was groomed and blackmailed by sex traffickers posing as a 16-year-old female classmate on Snapchat. The traffickers convinced the boy to exchange nude photos, then threatened to share them with people like his "parents, coach, pastor" if he didn't send more.
The traffickers allegedly told him to record sex acts with another child. Under the threat of having the sexually explicit images leaked, Doe complied, the suit claims.
Doe eventually blocked the predators, but the videos surfaced online in 2019, when two Twitter accounts that had previously shared child sex abuse material posted some of the clips, the victim said in the lawsuit.
The suit claims that the videos were reported to Twitter at least three times over the next month to no avail.
Court records show that Doe's classmates discovered the videos in January 2020, which was how he learned they had been posted. He was then subjected to "teasing, harassment, vicious bullying" that led him to become "suicidal."
Doe's parents tried to help the situation by contacting school officials and making police reports, while their son reported the posts to Twitter.
A Twitter support agent requested a copy of the boy's ID so his identity could be verified. After the teen complied, the family claimed they were left in the dark for about a week.
Around this time, Doe's mother also filed two complaints with Twitter about the same posts. She likewise received no response for a week, the suit claims.
On Jan. 28, a support agent finally replied to Doe, saying that no action would be taken to remove the material. At that point, the videos showing Doe and the other underage victim performing sex acts had been viewed over 167,000 times and retweeted 2,223 times, the suit states.
"Thanks for reaching out. We've reviewed the content, and didn't find a violation of our policies, so no action will be taken at this time," Doe claims Twitter Support told him.
Doe's reply was included in the complaint.
"What do you mean you don't see a problem? We both are minors right now and were minors at the time these videos were taken. We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos AT ALL and they need to be taken down," he wrote, including the case number he had from a local law enforcement agency.
Two days later, progress was finally made when the victim's mother connected with a Department of Homeland Security agent through a mutual contact. That agent was able to get the videos removed on Jan. 30, according to the suit.
"Only after this take-down demand from a federal agent did Twitter suspend the user accounts that were distributing the CSAM and report the CSAM to the National Center on Missing and Exploited Children," says the lawsuit. CSAM is short for "child sexual abuse material."
"This is directly in contrast to what their automated reply message and User Agreement state they will do to protect children."
The suit went on to accuse Twitter of knowingly profiting from hosting CSAM on its platform by running advertisements alongside the illicit tweets.
In a statement to the New York Post, a Twitter spokesperson did not address Doe's specific complaint but said the company's teams "work to stay ahead of bad-faith actors."
"Twitter has zero tolerance for any material that features or promotes child sexual exploitation. We aggressively fight online child sexual abuse and have heavily invested in technology and tools to enforce our policy," the spokesperson said.
"Our dedicated teams work to stay ahead of bad-faith actors and to ensure we’re doing everything we can to remove content, facilitate investigations, and protect minors from harm — both on and offline."
Now that Elon Musk is in control of Twitter, anti-sex trafficking activists are beginning to feel hopeful about the prospect of the platform stamping out CSAM.
"It brings a tear to my eye," human trafficking survivor advocate Eliza Bleu said on Thursday after Musk announced that more CSAM had been removed from Twitter in the last month than in the last decade.
After Musk dropped the "Twitter Files," internal documents showing some of the distasteful inner workings of Twitter, Bleu gave credit to the new owner for the transparency and the fight against child pornography.
"Twitter not only dropped The first Twitter Files yesterday…," she tweeted.
"In the last 24 hours they removed 44k child sexual exploitation material profiles," she said.