Tuesday, April 24, 2018

Now Facebook Will Tell You Why You're In 'Facebook Jail'

If you’ve engaged in political discussions on Facebook, the chances are good that you have had posts deleted and even spent some time in “Facebook jail,” the euphemism for temporary bans that put users in “timeout” when they get caught posting something inappropriate. A big problem has been that Facebook’s community standards were vague and unevenly applied: many users didn’t know which standard they were accused of violating and had no way to appeal Facebook’s decision. Now that has changed.

Today Facebook released its full community standards playbook, the roughly 8,000-word guide used by the company’s 7,500 moderators. The guide is split into six sections based on the types of posts that can be removed: violence and criminal behavior, safety, objectionable content, integrity and authenticity, intellectual property, and content-related requests. The company says the standards will continue to be updated frequently.

Many of the prohibited categories are straightforward, but there are numerous exceptions. For example, Facebook says, “we default to removing sexual imagery,” but “make allowances for the content” that meets certain criteria, such as when it is “a form of protest, to raise awareness about a cause, or for educational or medical reasons.”

“We remove content that glorifies violence or celebrates the suffering or humiliation of others,” Facebook says, but “we allow graphic content (with some limitations) to help people raise awareness about issues.”

Regarding the sticky issue of hate speech, Facebook lists a number of protected categories that include “race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity, and serious disability or disease” as well as “some protections for immigration status.” Facebook does not allow attacks, defined as “violent or dehumanizing speech, statements of inferiority, or calls for exclusion or segregation,” on these groups.

Hate speech is divided into three tiers. Tier 1 attacks are violent or dehumanizing speech. Tier 2 attacks are “statements of inferiority.” Tier 3 attacks are calls to exclude or segregate protected groups.

There have been reports of Facebook users reposting threats that they have received and then being banned. Facebook says that if hate speech is reposted for the purpose of raising awareness, “we expect people to clearly indicate their intent, which helps us better understand why they shared it.” Otherwise, the content may be removed.

When it comes to “fake news,” Facebook allows it. Noting that there is “a fine line between false news and satire or opinion,” the company says, “we don't remove false news from Facebook but instead, significantly reduce its distribution by showing it lower in the News Feed.”

Engadget points out that, in a recent blog post, Facebook addressed accusations that viewpoint bias results in uneven application of the rules. “Our reviewers are not working in an empty room; there are quality control mechanisms in place, and management on site, that reviewers can look to for guidance,” the company said.

When the company makes a mistake, new policies will allow users to appeal the removal of content, Monika Bickert, Facebook’s vice president of product policy and counter-terrorism, told Reuters. In the past, only the removal of accounts, groups and pages could be appealed. Bickert also said that Facebook would begin to give users specific reasons when their content is removed.

It’s important to remember that the First Amendment does not apply to Facebook and social media. The constitutional guarantee of free speech applies only to government restrictions. Private companies like Facebook have the right to police their communities as they see fit, but unless the companies respect freedom of speech, they will ultimately chase users away to more open platforms. There has already been much talk online from conservatives about leaving Facebook and Twitter in the wake of reports that the companies favor liberal viewpoints.

Facebook’s new openness about community standards seems to be an attempt to fix the problem of vague standards for users. As Bickert said, “You should, when you come to Facebook, understand where we draw these lines and what’s OK and what’s not OK.”


Now if Facebook would just apply those standards evenly across the political spectrum. 

Originally published on The Resurgent
