Facebook has had community standards for a while now, but it hasn't always been clear about what is and isn't allowed on its platform. And sometimes, this has led to some serious confusion. Last year, ProPublica unearthed an internal Facebook training document that appeared to prioritize "white men" over "black children," and later discovered that community moderators were often wildly inconsistent about what they considered hate speech. Facebook has since apologized for these errors, and today it's hoping to clear up its reputation even further: It's publishing its internal community enforcement guidelines for the very first time.
To be clear, the community standards themselves have not changed. Instead, Facebook is updating them with more detail about how they're enforced. According to Facebook, the guidelines published today are exactly the same as the ones used by the company's 7,500 or so moderators around the world. It all seems to be part of the company's renewed effort to be more transparent with its users.
"We want to give people clarity," said Monika Bickert, Facebook's VP of Global Policy Management. "We think people should know exactly how we apply these policies. If they have content removed for hate speech, they should be able to look at their speech and figure out why it fell under that definition."
"The other reason we're publishing this is to get feedback on these policies," she continued. "[Getting] real world examples or examples on how an issue manifests itself in the community is helpful."
To go along with this announcement, Facebook is also expanding its appeals process. Until now, if you had a specific post or photo removed for violating community guidelines, you didn't have the option to appeal that decision. Now, you do. You'll be given the option to "Request Review," and Facebook's Community Operations team will look at the request within 24 hours. If a mistake has indeed been made, Facebook promises to restore the post or photo.
"We're going to offer appeals for posts and photos not only if we remove the post and photo, but also if you report a photo and post and we don't remove it," said Bickert. "You'll have the opportunity to say hey, 'Take another look at this.'"
The community standards document is fairly lengthy, but it essentially covers six distinct categories: violence and criminal behavior, safety, objectionable content, integrity and authenticity, respecting intellectual property, and content-related requests. Some of the guidelines seem fairly straightforward; for example, a credible threat of violence could result in a takedown, and even a report to the authorities. Masquerading as someone else is clearly prohibited, as are child nudity and trafficking in illegal goods.
But others, like the ones around hate speech, are a lot more nuanced. Facebook states that it doesn't allow hate speech on its platform, and defines it as a "direct attack on people based on what we call protected characteristics -- race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity and serious disability or disease." It also offers some protection for immigrants, and defines an attack as "violent or dehumanizing speech, statements of inferiority, or calls for exclusion or segregation." Examples include violent speech (which would be a Tier 1 offense), expressions of disgust (a Tier 2 offense) or a call to exclude (a Tier 3 offense).
That might sound pretty clear, but the reality is a lot more granular than that. For example, Facebook says that while you can't attack a person, you can still criticize an organization, a country, and even a religion. So while you can't say "Scientologists are evil," you can say "Scientology is evil." If that sounds a little like splitting hairs, well, even Facebook would agree that it sometimes runs into these tricky definitions.
Additionally, while Facebook's hate speech policies cover the above protected characteristics, they don't always cover subsets. In a New York Times quiz last year, the paper posited that "Female sports reporters need to be hit in the head with hockey pucks" would not be considered hate speech under Facebook's policies, because while gender is a protected category, occupation is not. Facebook did tell the Times that the statement would likely be flagged as a violent threat, which falls under a different policy, but it's still troubling that targeting female sports reporters as a group isn't, on its own, enough to count as hate speech.
Still, Facebook took pains to say that these policies are not static; they're ever-changing. According to Facebook, the content policy team meets every two weeks with various other teams within the company, such as engineering and operations. Depending on the issue, it also meets with teams from legal, public policy, diversity, child and women's safety, and government relations, as well as external stakeholders like academics, researchers, counterterrorism experts and hate organization experts. On the issue of abortion, for example, the team might meet with both pro-choice and pro-life groups to get a fuller understanding of the topic.
In a recent blog post, Facebook also said that it attempts to protect against human bias with extensive training as part of the onboarding process. "Our reviewers are not working in an empty room; there are quality control mechanisms in place, and management on site, that reviewers can look to for guidance," it states. Facebook also conducts weekly audits to check the decisions made. But even then, mistakes happen. "Even if you have a 99.9 percent accuracy rate, you'll still have made many mistakes every day," said Bickert.
Of course, one of the biggest issues with Facebook's community standards is that it's one set of guidelines for the whole world, which doesn't always align with local laws. Germany, for example, has much stricter laws around hate speech, so a post that would be legal elsewhere in the world might have to be made unavailable there. "Our standards are global," said Bickert. "But there are times when we have to be very local in our application."
"We do think cultural context is important," she continued. "When we are hiring reviewers to cover certain languages, we have native Portuguese speakers from Brazil and from Portugal, because of the different ways language is used."
This cultural context is all the more important as Facebook grows in popularity in the developing world, where it has unfortunately been used to spread misinformation and false rumors that can sometimes result in violent riots. Facebook says it's catching up and attempting to rectify the situation, but it's understandably hard to be patient when lives are at stake.
To that end, the company will be holding several public summits around the world in the coming months. The first three forums will be held in mid-May in England, Paris and Berlin, with subsequent summits in India, Singapore and the US. "They're going to be very interactive," said Bickert. "We want to get their feedback and incorporate them, and make sure the team is taking them into account in policy development and updates to community standards."
"There will always be people who will try to post abusive content or engage in abusive behavior," said Bickert. "[Revealing our guidelines] is our way of saying, these things are not tolerated."
- This article originally appeared on Engadget.