Facebook’s “Supreme Court” to Rule on Favoritism of Elite Users
Should Facebook have different rules for high-profile users than it does for the masses? Facebook’s Oversight Board said this week that it will review the company’s use of more relaxed content rules for high-profile users such as athletes and politicians. Under review is the company’s “Cross Check” program for its better-known users.
What is the Oversight Board?
Mark Zuckerberg, the founder and CEO of Facebook, created the Oversight Board in 2019. It is a worldwide 20-member body that includes human rights activists, lawyers, journalists, professors, and others, and it hears appeals after Facebook makes content moderation decisions. It is sometimes referred to as “Facebook’s Supreme Court” because it can overturn decisions made by company executives; it has the final word.
Since its creation, the Oversight Board has received more than 500,000 requests from Facebook users to examine the company’s content moderation decisions. It has issued 15 decisions in what it calls “important cases,” 11 of which overturned Facebook’s original ruling.
What’s at Stake in this Review
The review comes after a series of investigative reports by The Wall Street Journal describing how Facebook has applied one set of content moderation rules to regular users while allowing high-profile users significantly more latitude with material that potentially violates Facebook’s content guidelines. The review will examine whether politicians, celebrities, journalists, and other prominent users are given different treatment or held to a different standard of conduct.
The Wall Street Journal report showed that the “cross-check” program effectively shields millions of accounts from the enforcement actions applied to ordinary Facebook users. The program has allowed “whitelisted” accounts to post false claims that would not otherwise be tolerated, as well as harassment and other violating content, according to the Journal.
Last year almost six million accounts in the system received special treatment from Facebook employees. For the rest of Facebook’s 2.8 billion monthly users, violations of the company’s guidelines against bullying, hate speech, and similar conduct are handled by automated systems. The Oversight Board said this week that Facebook must be more transparent about how its rules are applied.
What Happens Next?
The Board said it expects to be briefed by Facebook within the next few days. The findings and any action will be published in October as part of its quarterly transparency report. The Board also said it has been examining the cross-check system “for some time.” In the past, it warned that a lack of transparency “could contribute to perceptions that Facebook is unduly influenced by political and commercial considerations.”
The Oversight Board is a Facebook-funded body that operates outside the company’s formal corporate chain of command and arbitrates content moderation decisions. As a separate entity, it has the final say on moderation disputes. It is currently examining Facebook’s practice of applying automated moderation to some users while holding others to a far less stringent moderation system.
As with most businesses, reputation is important. The Board, created less than two years ago by Mark Zuckerberg, is meant to help protect the company’s reputation. The outcome of this review will be known in October.