For years, Facebook has had what it calls “community standards” to explain what is and isn’t allowed on the site. The public explanation of those standards, however, was brief and general at best, while a far more detailed version existed internally. In the last couple of years, the company has faced fierce criticism for failing to do enough to stem hate speech, or even to prevent the service from being used to promote terrorism. And let’s not forget that actual murders and suicides have been posted, in some cases live. It now appears there have also been times when Facebook has helped repressive regimes by aggressively removing content they didn’t like.

And so, perhaps to calm the storm of angry voices currently surrounding the media giant, and perhaps to seem more above board, on Tuesday Facebook released its “rule book for the types of posts it allows on its social network, giving far more detail than ever before on what is permitted on subjects ranging from drug use and sex work to bullying, hate speech and inciting violence.”1

If you’ve got a business that runs in part on Facebook’s platform, take note.

Monika Bickert, Facebook’s vice president of product policy and counter-terrorism, said, “You should, when you come to Facebook, understand where we draw these lines and what’s OK and what’s not OK.”2 She also noted that the standards are constantly evolving, “based in part on feedback from more than 100 outside organizations and experts in areas such as counter-terrorism and child exploitation.”3

“The company considers changes to its content policy every two weeks at a meeting called the ‘Content Standards Forum,’ led by Bickert. At the April 17 meeting, about 25 employees sat around a conference table while others joined by video from New York, Dublin, Mexico City, Washington and elsewhere.”4

One new policy allows people to appeal a decision to take down an individual piece of content; previously, appeals were possible only for accounts, Groups, and Pages.

What is missing from the new community standards, however, is any description of the separate procedures for governments that want content removed. In that instance, “formal written requests are required and are reviewed by Facebook’s legal team and outside attorneys. Content deemed to be permissible under community standards but in violation of local law – such as a prohibition in Thailand on disparaging the royal family – is then blocked in that country, but not globally.”5

Facebook is the world’s largest social network, and it has become a dominant source of information in countries all over the world. Today it takes both automated software and about 7,500 human moderators to take down text, pictures, and videos that violate its rules. In May and June, the company will hold a series of public forums in different countries to get more feedback on those rules.

Sources and References

  1. Reuters, April 24, 2018.
  2. Reuters, April 24, 2018.
  3. Reuters, April 24, 2018.
  4. Reuters, April 24, 2018.
  5. Reuters, April 24, 2018.