Facebook’s “Community Standards” aim to combat “hate speech”, but arbitrary and unappealable decisions are deterring users from sharing on the platform.
Facebook is a global phenomenon: the fastest-growing and largest social network in history. It unites families and friends like nothing that has gone before it. It generates enormous revenues, and is investing huge sums in the future of networking, as well as in virtual and mixed realities, with the aim of becoming the primary space in which its adoring users live out more and more of their lives.
But the world’s largest social network has a problem: with immense power come complex responsibilities. Zuckerberg’s creation has a battle on its hands: it needs to keep its users safe from scammers and from nefarious forces seeking to influence opinions and even world politics, and to enforce codes of behaviour which ensure that Facebook remains a fun, trusted space in which more and more people feel safe sharing their personal lives. That, after all, is how Facebook makes the money it needs to survive, grow and expand.
But thousands, perhaps millions, of users are being told that their posts and comments breach its “Community Standards”. Content is being removed, and users are being issued warnings and temporary bans, with dire threats that continued breaches will see them removed permanently from the platform, without the chance to appeal the decision or even to learn the reasons behind it. Such a decision would leave them potentially unable to connect with friends and family through the world’s dominant social media platform.
This is causing resentment and fear among users, and has the potential to become an even bigger headache for Facebook than Russian meddling and fake news: for if users cannot trust the platform as a place to share opinions and experiences, if they can only use it under perpetual fear of eternal banishment, how can they trust it with their most intimate family moments?
Facebook’s management says it devotes a great deal of time and resources to content moderation on a vast, unprecedented scale. But it is easy to find examples of its moderation model going awry.
One friend received a 30-day block for posting this:
This is clearly a mistake on the part of the moderators. This comment could not be reasonably read as an attack on Obama, or on black people, or as being in support of those who created the ape effigies.
But there is no way for individual users to query Facebook’s decisions, to appeal, to seek a review, or to reset whatever internal counters the social media giant uses to decide whether to kick people off the platform entirely.
The aforementioned community standards have this to say about controversial content:
“Sometimes people share content containing someone else’s hate speech for the purpose of raising awareness or educating others. Similarly, in some cases, words or terms that might otherwise breach our standards are used self-referentially or in an empowering way. When this is the case, we allow the content, but we expect people to clearly indicate their intent, which helps us better understand why they shared it. Where the intention is unclear, we may remove the content.”
Facebook claims to make allowances for satire and comedy:
“We allow humour and social commentary related to these topics. In addition, we believe that people are more responsible when they share this kind of commentary using their authentic identity.”
However, I’ve spoken to curators of satire and comedy pages on the platform from around the world who say that the mass reporting of innocuous jokes has made Facebook’s model untenable for independent comedy pages.
I recently received a 30-day ban from Facebook for sharing the following post, in an open satire group:
This is easy to recognize as social commentary and not an endorsement of the idea that “all men are trash”, or an actual criticism of religious figures or even feminism. It’s a joke about intolerance.
But, since users have no way to appeal or query such decisions, and Facebook frequently threatens to impose permanent bans, its messaging sounds like a heady mix of George Orwell, Franz Kafka and Anthony Burgess: do not, under any circumstances, say anything which someone else could find objectionable in some way you cannot possibly anticipate. Don’t, in fact, say anything.
In other words, don’t trust Facebook as a place to have a conversation. And, since you face the threat of further blocks or even a permanent ban, don’t trust Facebook as a place to store your pictures or experiences, as a tool to log in to other platforms, or as a space to plan your future lives online or in virtual worlds.
I asked Facebook what it plans to do to change this perception, and what users can do to help ensure Facebook’s Community Standards work as intended.
More than a week later, Facebook has still not commented on the various examples I sent of its policies being wrongly applied.
It did, however, send a link to a blog post by Richard Allan, VP of EMEA Public Policy, which said in part:
“Last year, Shaun King, a prominent African-American activist, posted hate mail he had received that included vulgar slurs. We took down Mr. King’s post in error — not recognizing at first that it was shared to condemn the attack. When we were alerted to the mistake, we restored the post and apologized. Still, we know that these kinds of mistakes are deeply upsetting for the people involved and cut against the grain of everything we are trying to achieve at Facebook.”
The message, apparently, is that if you are famous, you might have a chance of Facebook’s faceless moderators reviewing a decision; if you aren’t, there is nothing you can do, though at least Facebook feels bad about it.
While I was waiting for Facebook to respond further, the editor-in-chief of Conatus News himself received a ban for an old comment on an old joke of mine.
Lucas, himself Jewish, made a joke about anti-Semitism. Despite Facebook’s Community Standards stating clearly that humour is not hate speech, its moderation system – human or algorithmic – was apparently unable to comprehend the joke.
Much like ED-209 blowing away Mr. Kinney in the movie RoboCop, Facebook’s moderators “didn’t see” the joke.
If Facebook ever does respond to explain why it cannot enforce its own policies regarding comedy, I’ll be delighted to pass its response on.
In the meantime, when my current ban expires, I’ll avoid using Facebook for anything personal or important – and if you are one of the millions of users Facebook routinely threatens with a permanent ban, you should do the same.
If you have stories of unfair bans from social media, let us know by posting with the hashtag #FacebookBan.
The writer runs Satiria, a popular online source of satirical images, posters and quotes on current political and cultural issues. They can be found at http://FACEBOOK.satiria.net and @satirianews on Twitter.