
Facebook publishes new content guidelines

Facebook CEO Mark Zuckerberg at the company's headquarters in Menlo Park, California
Theresa Hayes | 24 April, 2018, 19:38

Facebook has gone public on how it decides what is and is not allowed on its social network, publishing in full the Community Standards guidelines it issues to the employees responsible for policing content for material such as hate speech, child abuse and terrorism.

Monika Bickert, Facebook's vice president of product policy and counter-terrorism, said Facebook intends to extend the appeals process "by supporting more violation types, giving people the opportunity to provide more context that could help us make the right decision, and making appeals available not just for content that was taken down, but also for content that was reported and left up". To help users understand what is and is not allowed, the company has published its Community Standards for everyone to read. The social networking giant's guidelines cover everything from violence and bullying to privacy and copyright. The move to give users more say in how content is removed comes as the social network fends off criticism on an array of fronts, including its handling of people's data, the spread of "fake news", and whether politics has influenced its content-removal decisions.

Now, the company is providing the longer document on its website to clear up confusion and be more open about its operations, Bickert said. Facebook is also giving users the right to appeal its decisions on individual posts, so they can ask for a second opinion when they think the company has made a mistake.

The introduction of an appeals process is an important change of tack by Facebook, and the company is likely to find itself inundated with appeals from people who feel they have been unjustly censored.

"There's always more that we can be doing in this space and that's really what today is all about". The appeal process to be built up during the year ahead will extend that right to individual posts, according to Cummiskey. Some of the credible violent threats include sexual violence, promote terrorism, poaching of endangered species, selling or buying firearms, attempts to buy marijuana, prescription of drug prices for sale, self-injury posts, and sexual mistreatment of a minor, multiple homicides at different times and location, cannibalism videos, nudity, promotion of hate groups and other explicit posts.

"If your photo, video or post has been removed because it violates our Community Standards, you will be notified and given the option to request additional review," the company explains.

Facebook promises that a person will review the post within 24 hours to assess whether its algorithms have missed the mark. "This is going to be a way to give people a real voice in this process".

"If we've made a mistake, we will notify you, and your post, photo or video will be restored," it adds.

Facebook's earlier Community Standards page banned most of the same explicit behaviour but sketched out the company's rules only in broad strokes. "That's why we have developed a set of Community Standards that outline what is and is not allowed on Facebook," the new document explains.

The company considers changes to its content policy every two weeks at a meeting called the "Content Standards Forum", led by Bickert.

The document also acknowledges the difficulty of enforcement at scale: "Another challenge is accurately applying our policies to the content that has been flagged to us. Every week, our team seeks input from experts and organizations outside Facebook so we can better understand different perspectives on safety and expression, as well as the impact of our policies on different communities globally. This can be challenging given the global nature of our service, which is why equity is such an important principle: we aim to apply these standards consistently and fairly to all communities and cultures."