Facebook's Secret List Of Banned Content Has Finally Been Published

Facebook frames the publication as a two-part exercise in transparency: "This is an exercise of saying, here's where we draw the lines, and we understand that people in the world may see these issues differently. Second, providing these details makes it easier for everyone, including experts in different fields, to give us feedback so that we can improve the guidelines - and the decisions we make - over time."

Content where someone "admits to personal use of non-medical drugs" will be removed, along with videos of people "cursing at a minor". Monika Bickert, Facebook's vice president of global policy management, writes that her team also seeks input from experts and organizations outside Facebook to better understand different perspectives on safety and expression, as well as the impact of the company's policies on different communities around the world. "What has not changed - and will not change - are the underlying principles of safety, voice and equity on which these standards are based," she writes. "We outline these principles explicitly in the preamble to the standards, and we bring them to life by sharing the rationale behind each individual policy."

Balancing those principles is the hard part: Facebook should be a place where people can express their opinions freely, even if some people find those opinions objectionable, yet the company also claims that "ninety-nine percent of terrorist content is removed before it is ever flagged by our community of users". With a team of just 7,500 content reviewers, Facebook has quite a task on its hands.

The new appeal process will first cover posts removed on the basis of nudity, sex, hate speech or graphic violence.

Currently, people whose posts are taken down receive only a generic message telling them they have violated Facebook's community standards.

Facebook promises that a person will review the post within 24 hours to assess whether its algorithms have missed the mark. But its AI tools aren't close to the point where they could pinpoint subtle differences in context and history - not to mention shadings such as humor and satire - that would let them make judgments as accurate as those of humans.
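As a way to picture that process, here is a toy Python sketch of the appeal flow the article describes; the names (Post, request_appeal, human_review) and the four-category gate are illustrative assumptions, not Facebook's actual code:

```python
from dataclasses import dataclass

REVIEW_DEADLINE_HOURS = 24  # Facebook's stated turnaround for an appeal review

# The four categories covered at launch, per the article.
APPEALABLE = {"nudity", "sex", "hate speech", "graphic violence"}

@dataclass
class Post:
    post_id: str
    removed_for: str        # reason given for the takedown
    appealed: bool = False
    restored: bool = False

def request_appeal(post: Post) -> bool:
    """Appeals are initially available only for the launch categories."""
    if post.removed_for in APPEALABLE:
        post.appealed = True
    return post.appealed

def human_review(post: Post, violates_policy: bool) -> None:
    """A person re-checks the post (within 24 hours); if the original
    takedown missed the mark, the post is restored."""
    if post.appealed and not violates_policy:
        post.restored = True

post = Post("p1", removed_for="hate speech")
if request_appeal(post):
    human_review(post, violates_policy=False)
print(post.restored)  # True: the takedown was overturned on review
```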

In May, Facebook will launch Facebook Forums: Community Standards, a series of public events in Germany, France, the UK, India, Singapore, the U.S. and other countries where the company will get people's feedback directly. "I have actually had conversations where I talked about our standards and people said, 'I didn't actually realize you guys have policies,'" said Bickert.

The company's censors, called content moderators, have been chastised by civil rights groups for mistakenly removing posts by minorities who had shared stories of being the victims of racial slurs.

Bickert's team has been working for years to develop a software system that can classify the reasons a post was taken down so that users could receive clearer information - and so Facebook could track how many hate speech posts were put up in a given year, for example, or whether certain groups are having their posts taken down more frequently than others.
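To illustrate the idea, here is a minimal Python sketch of such a system, with an invented reason taxonomy and record format standing in for whatever Bickert's team has actually built:

```python
from collections import Counter
from dataclasses import dataclass
from enum import Enum

class TakedownReason(Enum):
    # Hypothetical labels; Facebook's real taxonomy is far larger.
    NUDITY = "nudity"
    HATE_SPEECH = "hate speech"
    GRAPHIC_VIOLENCE = "graphic violence"
    DRUG_ADMISSION = "admission of non-medical drug use"

@dataclass
class Removal:
    post_id: str
    reason: TakedownReason
    year: int

def user_notice(removal: Removal) -> str:
    """A specific notice instead of today's generic 'community standards' message."""
    return (f"Your post was removed because it violates our policy on "
            f"{removal.reason.value}.")

def takedowns_by_reason(removals: list[Removal], year: int) -> Counter:
    """Track, e.g., how many hate-speech posts were removed in a given year."""
    return Counter(r.reason for r in removals if r.year == year)

log = [Removal("p1", TakedownReason.HATE_SPEECH, 2018),
       Removal("p2", TakedownReason.NUDITY, 2018),
       Removal("p3", TakedownReason.HATE_SPEECH, 2017)]
print(user_notice(log[0]))
print(takedowns_by_reason(log, 2018)[TakedownReason.HATE_SPEECH])  # 1
```

The same per-reason counts would also show whether certain groups have their posts removed more often than others, once removals are joined to account data.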

"For years, we've had Community Standards that explain what stays up and what comes down," Bickert writes; the newly published guidelines add the detailed rules reviewers use to make those calls. They draw fine distinctions: firearms content is restricted, for example, but if you're posting about sporting firearms, or you're a legitimate business, your content can stay up. The company also now uses software to identify duplicate reports, a timesaving technique that spares reviewers from assessing the same piece of content over and over when many people flag it at once.
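The duplicate-report technique is easy to picture in code. Here is a small illustrative sketch, assuming reports can be keyed by a hash of the reported content (how Facebook actually keys them is not public):

```python
import hashlib

# Hypothetical report stream: (reporter, reported content) pairs.
reports = [
    ("alice", "offending post text"),
    ("bob",   "offending post text"),  # the same post, flagged again
    ("carol", "a different post"),
]

seen: set[str] = set()
review_queue: list[str] = []

for reporter, content in reports:
    # Key each report by a hash of the content so a post flagged by
    # many people at once reaches a reviewer only once.
    key = hashlib.sha256(content.encode()).hexdigest()
    if key not in seen:
        seen.add(key)
        review_queue.append(content)

print(len(review_queue))  # 2 review jobs queued for 3 reports
```

In practice a stable post ID would serve the same purpose as the hash; the point is that many reports collapse into one review job.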

All interesting and no doubt useful for the average Facebook user to know.

Fake or false news is also prohibited. Asked for further comment, Facebook told us it would get back to us when it had something new to say; we will update this piece when we get a response.
