
This is how Facebook decides what you can post – report

The rules that Facebook moderators refer to when deciding whether to remove content such as hate speech and violence from the platform have been leaked.

The Facebook Files, as The Guardian calls them, detail exactly what sort of content can be posted to Facebook.

For instance, a post which reads “someone shoot Trump” would be deleted because President Trump is a head of state. However, a post which reads “To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat” would not be deleted because, according to The Guardian’s reading of these rules, Facebook doesn’t see the statement as a credible threat.

Nudity is also seemingly a big no-no in the eyes of Facebook. Handmade art showing nudity is fine, but if your art is digital and contains nudity, your content might be removed. Videos showing abortions are fine as well, so long as there is no nudity.

Videos of violent deaths, meanwhile, are left untouched as they might raise awareness of issues such as mental illness. Images of non-sexual physical abuse and bullying of children are permitted so long as they are not sadistic or celebratory.

As for animal abuse, Facebook allows users to post that content so long as it isn’t sadistic or celebratory in nature. The reason for this – once again – is tied to creating awareness around these issues.

Operating in extremes

At first glance, these guidelines seem rather harsh for some content and lax for others. The reason for this comes down to context.

For instance, a news site might post a video showing footage of a child being bullied as part of the evidence for a story. In that context the imagery wouldn’t be removed, and rightly so.

What the Facebook Files reveal is something more alarming: just how hard moderators have to work to flag content. At the time of writing, the social media platform employs some 4 500 content moderators, according to The Guardian, who review millions of posts every day.

Facebook does take measures to make sure moderators can cope mentally with reviewing what could be the most vile content on the platform, but with just 4 500 moderators monitoring nearly two billion users, the workload is immense.

From what we can deduce from the Facebook Files, it appears as if the platform is trying its best to keep free speech alive while at the same time making sure folks aren’t spreading dissent, violence or abuse, which can’t be an easy job.
