Facebook responds after internal guidelines on sex and violence revealed

The leaked guidelines show what content is allowed on the social media platform


Facebook has responded after internal guidelines for issues including sex, violence, terrorism and racism were revealed.

An investigation by The Guardian reviewed over 100 internal training manuals, spreadsheets and flowcharts pertaining to issues such as violence, hate speech, terrorism, pornography, racism and self-harm.

According to The Guardian, Facebook also has guidelines for issues such as match-fixing and cannibalism.

One leaked document said that Facebook reviews more than 6.5 million reports a week relating to potentially fake accounts, known internally as FNRP (fake, not real person).

Leaked guidelines

Facebook's extensive guidelines outline what content should or should not be deleted from the site. For example, according to the leak, remarks such as "Someone shoot Trump" should be deleted, because as a head of state he is in a protected category.

However, it is acceptable to say "To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat", or "fuck off and die", because these are not regarded as credible threats.

Elsewhere in the guidelines, videos of violent deaths, even though they may be marked as disturbing, do not always have to be deleted because they can help create awareness of issues such as mental illness.

Some photos of non-sexual physical abuse and bullying of children do not have to be deleted or "actioned" unless there is a sadistic or celebratory element.

Meanwhile, photos of animal abuse can be shared, with only extremely upsetting imagery to be marked as "disturbing".

All "handmade" art showing nudity and sexual activity is allowed but digitally made art showing sexual activity is not. Also videos of abortions are allowed, as long as there is no nudity.

Facebook's response

Monika Bickert, head of Global Policy Management at Facebook, says: "Keeping people on Facebook safe is the most important thing we do.

"We work hard to make Facebook as safe as possible while enabling free speech.

"This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously.

"(Facebook CEO) Mark Zuckerberg recently announced that over the next year, we'll be adding 3,000 people to our community operations team around the world on top of the 4,500 we have today to review the millions of reports we get every week, and improve the process for doing it quickly.

"In addition to investing in more people, we're also building better tools to keep our community safe.

"We're going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help."


"Terror of war"

Facebook will also allow people to livestream attempts at self-harm because it "doesn’t want to censor or punish people in distress".

According to Facebook, anyone with more than 100,000 followers on a social media platform is designated as a public figure, which denies them the full protections given to private individuals.

In one of the leaked documents, Facebook acknowledges "people use violent language to express frustration online" and feel "safe to do so" on the site.

It also allows for "newsworthy exceptions" under its "terror of war" guidelines but draws the line at images of "child nudity in the context of the Holocaust".

Facebook also said it is using software to intercept some graphic content before it gets on the site, but that "we want people to be able to discuss global and current events … so the context in which a violent image is shared sometimes matters."

Additional reporting: Kenneth Fox