
Facebook denies reports that it protected UK far-right activists

Who is responsible for the content that appears on Facebook?

Some may say it’s the company’s policymakers who serve as the gatekeepers, but an upcoming documentary points to the content moderators as the true enforcers.

The documentary in question is part of Channel 4’s Dispatches series in the UK, where it is set to air tonight (17th July) at 22:00 (SAST).

https://www.youtube.com/watch?v=SiECk8icT0Q

In it, Dispatches sends a member of its team undercover to work as a moderator at one of Facebook’s contractors, CPL.

The series says it has evidence that moderators actively refused to ban content from far-right activist pages in the UK, which included hate speech, child abuse, graphic violence and otherwise toxic material.

Facebook has outlined its view and policies on toxic content in the past, and it is commonly believed that pages which shared or posted such material would be banned from the social media platform if they contravened its rules on more than five occasions.

According to Dispatches’ investigation that is not the case, thanks to a surreptitious second tier of moderation referred to as “shielded review.” In such cases, Facebook employees, and not CPL moderators, decided the fate of such pages and whether or not to take action.

The lightning rod in this entire situation is UK far-right organisation Britain First, which was granted shielded review and did not have any of its toxic content pulled, or indeed have its page banned.

Equally telling is one moderator’s response as to why the page was not banned. “They have a lot of followers so they’re generating a lot of revenue for Facebook,” noted the unnamed moderator.

Another troubling confession is that moderators are instructed to ignore toxic content from users under 13 years old, unless they explicitly state their age on said content.

In response to the forthcoming documentary, Facebook’s VP of Global Policy Management, Monika Bickert, admits that the company has made some mistakes in policing content, but denies that it ever did so for financial benefit.

“We take these mistakes incredibly seriously and are grateful to the journalists who brought them to our attention. We have been investigating exactly what happened so we can prevent these issues from happening again… It has been suggested that turning a blind eye to bad content is in our commercial interests. This is not true,” writes Bickert.

How Facebook will react once the documentary airs in full remains to be seen, as does how users’ view of the company will be affected by what gets revealed.

Is this a lack of accountability on Facebook’s part, or is the company simply safeguarding the right to free speech for all its users, however toxic they may be?

Let us know your thoughts on Twitter and Facebook.

[Source – Engadget] [Image – Facebook]