Facebook has revealed long-held secrets on how it polices the social network and why it bans certain content.

For the first time, the social network has released a detailed 27-page internal document on how it moderates content according to its so-called community standards.

The playbook lays out rules for posting on subjects including drug use, sex work, cyberbullying, hate speech and incitement to violence.

In the past, the rules governing what Facebook’s 2.2 billion users were allowed to post were kept under wraps, with only a shorter version of the community standards in the public domain.

The revelations are part of Facebook’s efforts to be more open and transparent and to clarify its posting policies.

Monika Bickert, Facebook’s vice president of product policy and counter-terrorism, said:

I’ve been wanting to do this for a long time.

You should, when you come to Facebook, understand where we draw these lines and what’s OK and what’s not OK.

The new era of transparency comes after Facebook became embroiled in a scandal in which data from 50 million user profiles was shared with shadowy data analytics firm Cambridge Analytica, some of which is thought to have been used to influence elections in the US and the UK.

The scandal brought the company’s regulations, particularly those surrounding data privacy, into the spotlight.

Facebook has also come under fire from rights groups and governments, both over rules that have allowed the platform to be used to promote hate speech, violence and sectarian tensions and to broadcast acts of murder and suicide, and for removing posts critical of repressive regimes.

Bickert said that Facebook currently uses an army of around 7,500 moderators to police the forum, along with automated software to spot text, pictures and videos that violate its rules.

The company has also recruited legions of moderators in the past year due to pressure from a number of governments in countries where Facebook is a key source of information.

Bickert said the new dossier is the same as the guidelines provided to moderators, except for those on terrorism, which will not be disclosed.

She said:

Everybody should expect that these will be updated frequently.

Last week Facebook allowed reporters to sit in on a meeting of its so-called content standards forum, where moderation policies are set.

According to the newly published 8,000 word document, these are some of the ways Facebook’s rules have changed:

  • For the first time, users will be able to appeal a decision to take down content involving nudity or sexual activity, hate speech or graphic violence. Previously, only the removal of accounts, Groups and Pages could be appealed
  • Cannibalism videos and still pictures are banned. Age restrictions will apply to cannibalism posted for medical reasons
  • Mutilation, mass murder, and the sale of illicit or prescription drugs, marijuana or firearms on the social network are also banned
  • Admitting to using non-medical drugs on the public forum is also prohibited
  • Harassment, bullying and cursing at a minor are banned
  • Leaking information acquired from a hacked source is also not allowed “except in limited cases of newsworthiness”

In cases where local governments request that content be removed for violating national law, the posts will be reviewed by Facebook moderators and blocked in that country, but not globally, Bickert said.