Facebook has released new guidelines on how users whose content is removed or blocked from the site can seek a resolution by appealing to the Oversight Board. The Oversight Board is, in essence, an editorial board, meaning that Facebook has crossed over from platform, hosting content, to publisher, editing and approving content.
The Oversight Board posted their announcement on Twitter. They said that starting Thursday, the Oversight Board will be available for Facebook and Instagram users to request independent review of content decisions.
Facebook's editorial decision-making body said that "All decisions on cases taken by the Board - what content Facebook and Instagram should allow or remove - will be binding on the company. We will also provide policy recommendations, based on respect for freedom of expression and human rights."
This means that once a decision has been made by the Oversight Board, that's the final decision, and the company will stand by it. The method for appealing a fact check or content removal up to this point—a process this publication has traversed several times, getting fact checks reversed on each occasion—has been somewhat opaque.
Once a fact-check is issued by a third-party entity, a user's content is suppressed to 20 percent of its normal traffic. The user then appeals directly to the fact-checking entity, and days of back-and-forth may ensue, during which the user is at the mercy of that entity. Often, the entity will require that a post be changed, removed, or thoroughly edited to meet the third party's standards.
Facebook employed the services of third-party entities specifically so that it would not have to take on editorial and decision-making authority itself. Now, it is assuming that authority directly.
The Oversight Board wrote that "Over the coming weeks we will be sharing details on the first cases that the Board is considering, and also opening a process for third parties to share insights and perspectives that may be helpful to the case review process."
In creating an Oversight Board that is not a third party but is connected to the social media giant itself, and whose decisions are binding across the site, Facebook is rebranding itself as a publisher and arbiter of content rather than an open platform for free expression by all users.
On its site, the Oversight Board explains how cases will come before it. Whereas previously users would report content and fact-checkers would scan for violations, under the new program Facebook itself searches out speech it deems inappropriate for the site.
Content can be flagged by users, most likely including the third-party entities, or by Facebook itself. This is the biggest change: Facebook will now be policing users' speech on its own platform to determine whether it is allowable.