Facebook’s confusing and often criticised moderation process shows that the social media giant’s policies are full of holes which allow inappropriate content to thrive on the site, according to a series of leaked internal documents.
In the documents obtained by The Guardian, Facebook is shown to have policies on a wide range of issues, including hate speech, animal cruelty, graphic violence, sex, terrorism and Holocaust denial.
Facebook acts differently depending on whether the content is a video or a photograph. For example, an image of animal abuse may be allowed to remain on the site, whereas a video may be marked as disturbing. The same pattern is noted across the different categories.
Last year, Facebook was criticised for removing an iconic Vietnam War image of a naked girl fleeing a napalm attack. Under current policy, images of child abuse can remain on the site provided they are not celebratory or sadistic. The ambiguity in the policy stems from individual moderators’ personal judgments about how the content should be viewed.
In response to the leaked documents, Monika Bickert, Facebook’s Head of Global Policy Management, said in a statement that there were many variables when it comes to moderation.
“In the UK, being critical of the monarchy might be acceptable. In some parts of the world it will get you a jail sentence. Laws can provide guidance, but often what’s acceptable is more about norms and expectations. New ways to tell stories and share images can bring these tensions to the surface faster than ever.”
Ms Bickert said that the standards expected on the site changed over time to reflect shifting societal moods and norms. While the company accepted its responsibility to keep the site safe for users, it tried to stay objective as it walked the line between under- and over-censorship.
“Technology has given more people more power to communicate more widely than ever before. We believe the benefits of sharing far outweigh the risks. But we also recognise that society is still figuring out what is acceptable and what is harmful, and that we, at Facebook, can play an important part of that conversation.”
The leaking of the documents follows the announcement that the social media company would be hiring a further 3,000 moderators, bringing the total number of people trawling the site to 4,500.
All of the internal documents that were released can be viewed here.