
Here are some of the awful things you can say on Facebook

Newstalk

12.41 22 May 2017




The issue of policing the internet is one which has yet to be worked out. Where is the line between freedom of speech and spouting hate? Are social media sites platforms or publishers?

The Guardian managed to get its hands on the policies Facebook moderators use to police content on Mark Zuckerberg’s platform, or should that be publisher?

These documents make for fascinating reading. There are slideshows relating to violence, hate speech, terrorism, porn, racism, self-harm, match fixing and even cannibalism.  


The guidelines still depend on moderator compliance, and some insiders have expressed concern over a lack of consistency in this regard. Content ruled one way by one moderator may be looked upon differently by another.

It’s also worth noting that moderators only remove content once it has been reported. This means graphic or offensive content could be seen by millions before it is removed.

Here are just some of the terms, flagged in Facebook’s guidance document relating to “Credible Violence: Abuse standards”. These are not terms or words usually used on Newstalk.com, but we have decided to publish them to illustrate how the Facebook policing process works.

Real World Harm

This particular document opens with a brief explainer, which outlines how Facebook aims to allow as much speech as possible but draw the line at content that could credibly cause real world harm.

“We aim to disrupt potential real world harm caused from people inciting or coordinating harm to other people or property by requiring certain details to be present in order to consider the threat credible”, says the document.

Moderators are told to “delete or escalate any statement of intent to commit violence against a vulnerable person or group - identified by name, title, image or other references”.

In Facebook’s eyes, vulnerable people include heads of state, those next in line or candidates for head of state, specific law enforcement officers, witnesses, people with a history of assassination attempts, activists and journalists, homeless people, foreigners, and Zionists.

“Kick a person with red hair” is deemed to be an ok statement by Facebook’s standards. “Someone shoot Trump”, however, is not.

“I hope someone kills you” is ok, but “I hate foreigners and I want to shoot them all” is not.

Impact

Many of Facebook's documents are available to read here

Even if you are not on Facebook, these policies matter to you, as they shape the world around us. There is no question that how Facebook decides to police certain types of content will have an impact on many walks of life - from politics to music and everything in between.
