When it comes to moderation, Facebook’s fairness and PR goals are in conflict

Facebook claims to have a consistent set of standards for what appears on its platform. But based on the reporting in a recent Wall Street Journal article, it fails.

It attempts and fails to apply fair standards to ordinary people.

It attempts and fails to treat prominent people who post offensive material more personally and carefully.

It attempts and fails to avoid “PR fires” over what it allows on its platforms.

The result is inconsistent, incompetent, and infuriating. And the problem may be unsolvable.

The Wall Street Journal showed how privileged users get away with breaking the rules

The Jeff Horwitz article, titled “Facebook Says Its Rules Apply to All. Company Documents Reveal a Secret Elite That’s Exempt,” was published in Monday’s Wall Street Journal. It reports:

  • A group of users who are prominent figures in entertainment, sports, or politics — or are otherwise considered “influential” — operate under a completely different set of moderation rules, known as “XCheck.” These users are “whitelisted,” receiving more lenient treatment, and they are more likely to receive a review by a skilled human moderator. (The sketch after this list illustrates the two-tier logic.)
  • Amazingly, XCheck has grown to 5.8 million users, far too large for Facebook to administer in any consistent way. At one point, anyone at Facebook could add an individual to the privileged set of users.
  • Facebook told its Oversight Board that its system for high-profile users was used in “a small number of decisions.” Apparently, millions of users are “a small number.”
  • According to the article, “Facebook designed the system to minimize what its employees have described in the documents as ‘PR fires’—negative media attention that comes from botched enforcement actions taken against VIPs.” In other words, the objective is not fairness, but avoiding negative news coverage.
  • At Facebook, 45 teams are tasked with moderating this special content.
  • The system has had some spectacular misses, such as allowing the soccer star Neymar to post on Instagram nudes of a woman who was suing him for rape — in other words, revenge porn. While the nudes were eventually taken down, Neymar’s account remains active, even though any ordinary individual who did this would be banned.
  • Mark Zuckerberg admits that Facebook gets 10% of the decisions in its ordinary moderation system wrong, a conclusion reinforced by the many users I know who got banned for innocuous posts. For example, the article describes one user who was censored for complaining that “white paint colors are the worst.”
  • Politics trumps fairness here. One research scientist wrote in a 2020 memo that “Facebook currently has no firewall to insulate content-related decisions from external pressures,” citing interference by PR and senior executives in decisions. Another wrote that “Facebook routinely makes exceptions for powerful actors.”
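
The mechanics, as the Journal describes them, reduce to a simple branch: if the poster is on the whitelist, automated enforcement is suspended and the decision escalates to a human; if not, the algorithm acts immediately. Here is a minimal Python sketch of that two-tier routing; it is not Facebook’s actual code, and every name in it is hypothetical.

    # Minimal sketch of the two-tier routing the WSJ describes.
    # Hypothetical names; not Facebook's actual implementation.
    XCHECK_WHITELIST = {"famous_athlete", "senior_politician"}

    def moderate(user: str, post: str, violates_rules) -> str:
        """Route a flagged post through one of two very different pipelines."""
        if not violates_rules(post):
            return "published"
        if user in XCHECK_WHITELIST:
            # VIPs: no automatic takedown; the post stays up while it
            # waits for a skilled human reviewer.
            return "escalated to human review"
        # Everyone else: immediate automated enforcement, subject to the
        # roughly 10% error rate Zuckerberg concedes.
        return "removed automatically"

    # The same offending post gets two different outcomes:
    offensive = lambda post: True
    print(moderate("famous_athlete", "rule-breaking post", offensive))  # escalated to human review
    print(moderate("ordinary_user", "rule-breaking post", offensive))   # removed automatically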

The unsolvable problem

Military strategists describe a problem in modern engagements: asymmetric warfare. Well-equipped troops cannot safely take on thousands of random actors attempting every possible method to create chaos.

Facebook is engaged in asymmetric warfare with its user base.

If the goal of moderation is the application of a consistent set of rules, Facebook fails. Users post content with endless creativity and variation, so its algorithms are marred by both false positives (“paint colors”) and false negatives. And its outsourced human moderators become depressed or suicidal from looking at filth all day.

If the goal of moderation is to avoid PR blowups (which it shouldn’t be), then it is also doomed to fail. Remember the Streisand effect: attempts to suppress content tend to draw more attention to it. Avoiding PR disasters also assumes that the judgment of PR staff and senior executives is accurate. But as any PR professional will tell you, it’s hard to know which decision will create more of a problem: leaving up questionable content, or taking it down. And the act of applying judgment based on what looks bad makes whatever you do more public and more visible, exacerbating the PR problem.

If the goal of Facebook is to allow all users access to the same publishing options that once only famous people and journalists had, this two-tier moderation system undermines that goal. Facebook has special rules if you’re rich or famous, just like everything else in modern society.

Rather than attempting to place a Band-Aid over the misbehavior of famous people, Facebook should abolish XCheck and subject everyone to the same moderation standards. Yes, doing so would reveal how bad, inconsistent, and ineffective those standards are, and it would make Facebook look even worse than it already does.

But it might be the only way to force the company to put a lot more resources into fair and accurate moderation for the rest of us.
