
Meta touts reduction in policy enforcement 'mistakes,' says less than 0.1% of content published globally was removed in error

The company said that less than one out of every 1,000 pieces of content published was removed in error. 

Hannah Nightingale, Washington DC
Meta has released its third-quarter Community Standards Enforcement Report, revealing that the amount of content mistakenly removed from its platforms has drastically decreased since January. In total, the company said there were upwards of 36.5 million "false positives," or pieces of content mistakenly flagged, on Facebook, and upwards of 16.2 million false positives on Instagram.

In January, the social media company announced steps it would take to reduce Community Standards enforcement mistakes, including ending its third-party fact-checking program, moving to a Community Notes-style program similar to the one X uses, and getting rid of "a number of restrictions on topics like immigration, gender identity, and gender that are the subject of frequent political discourse and debate." Former President Joe Biden called the ditching of fact checkers "shameful."

Meta said in a press release that in the third quarter, less than one percent of the hundreds of billions of pieces of content posted globally was removed for violating policies, and less than 0.1 percent was removed incorrectly. Of the content removed, more than 90 percent was correctly removed from Facebook and more than 87 percent from Instagram, meaning that roughly one in 10 pieces of removed content, and fewer than one out of every 1,000 pieces of content published, was taken down in error.

"This has occurred as prevalence remained consistent across most problem areas, with a few exceptions: On both Facebook and Instagram, prevalence increased for adult nudity and sexual activity and for violent and graphic content, and on Facebook it increased for bullying and harassment. This is largely due to changes made during the quarter to improve reviewer training and enhance review workflows, which impacts how samples are labeled when measuring prevalence," the report stated.

Meta’s second quarter Community Standards Enforcement Report stated that since it began its efforts to "reduce over-enforcement, we’ve cut enforcement mistakes in the US by more than 75 percent on a weekly basis."

Meta was criticized by X owner Elon Musk in November, who wrote that the company bans people who misgender a trans-identified person, "but trafficking child prostitution allowed 17 strikes!" In unsealed court documents, Instagram's former head of safety Vaishnavi Jayakumar testified, "You could incur 16 violations for prostitution and sexual solicitation, and upon the 17th violation, your account would be suspended. By any measure across the industry, [it was] a very, very high strike threshold."

A report released by Meta in December 2024 acknowledged that "harmless content" had been excessively removed from its social media sites during the election season. Nick Clegg, Meta's President of Global Affairs, wrote: "We know that when enforcing our policies, our error rates are too high, which gets in the way of free expression we set out to enable. Too often harmless content gets taken down or restricted and too many people get penalized unfairly. We will continue to work on this in the months ahead."

The following month, Meta CEO Mark Zuckerberg told Joe Rogan that Facebook had faced "massive institutional pressure" to "start censoring content on ideological grounds" after Trump’s election and during the Covid pandemic. He said that things hit an "extreme" during the Biden administration as it was trying to roll out its vaccine program. "I think on balance, the vaccines are more positive than negative, but I think that while they're trying to push that program, they also tried to censor anyone who is basically arguing against it."

"And they pushed us super hard to take down things that were honestly, were true," he added. "They basically pushed us in and said, you know, anything that says that vaccines might have side effects, you basically need to take down."