Facebook will highlight a satire provision in its terms of service

Earlier this year, Facebook’s Oversight Board looked into an appeal filed by a user after their content was taken down.

The content in question was a version of the Two Buttons meme template that depicted Turkey having to choose between “The Armenian Genocide is a lie” and “The Armenians were terrorists who deserved it”.

The post was removed because, according to Facebook, it violated its policy on hate speech.

“We do not allow hate speech on Facebook, even in the context of satire, because it creates an environment of intimidation and exclusion, and in some cases, may promote real-world violence,” wrote Facebook.

However, it seems even Facebook’s moderators couldn’t decide how the content should be classified, as the post was later said to violate the Cruel and Insensitive Community Standard. The post was removed and the user was informed.

Following an appeal, Facebook found that the post had been removed on the basis of the Hate Speech Community Standard, and that the user hadn’t been informed of this.

Further to this, Facebook said that the post wasn’t covered by the exception that allows hateful content to be shared in order to condemn it or raise awareness of the problem.

“The majority of the Board, however, believed that the content was covered by this exception. The ‘two buttons’ meme contrasts two different options not to show support for them, but to highlight potential contradictions. As such, they found that the user shared the meme to raise awareness of and condemn the Turkish government’s efforts to deny the Armenian genocide while, at the same time, justifying these same historic atrocities. The majority noted a public comment which suggested that the meme “does not mock victims of genocide, but mocks the denialism common in contemporary Turkey, that simultaneously says the genocide did not happen and that victims deserved it.” The majority also believed that the content could be covered by Facebook’s satire exception, which is not included in the Community Standards,” wrote the Oversight Board.

As a result of these findings, the Oversight Board has advised that Facebook implement a satire exception in the public language of its Hate Speech Community Standard.

The full suite of recommendations follows below:

  • Inform users of the Community Standard enforced by the company. If Facebook determines that a user’s content violates a different Community Standard to the one the user was originally told about, they should have another opportunity to appeal.
  • Include the satire exception, which is not currently available to users, in the public language of its Hate Speech Community Standard.
  • Adopt procedures to properly moderate satirical content while taking into account relevant context. This includes providing content moderators with access to Facebook’s local operation teams and sufficient time to consult with these teams to make an assessment.
  • Let users indicate in their appeal that their content falls into one of the exceptions to the Hate Speech policy. This includes exceptions for satirical content and where users share hateful content to condemn it or raise awareness.
  • Make sure that appeals based on policy exceptions are prioritised for human review.

Facebook has already responded to these recommendations, with some being assessed for feasibility and others, such as the satire exception, being implemented in full.

However, some of Facebook’s statements are concerning.

For example, when it comes to informing users why their content has been removed, Facebook commits only to exploring the idea.

“We need to explore the benefit to user experience that could come from informing users of multiple violations and multiple appeal opportunities resulting from a single piece of content. Additionally, changing the technical ability, process, and training for how content moderators select policy violations for a piece of content, and the appeals that may follow, creates new operational complexity that we need to evaluate,” explained Facebook.

Shame that Zuckerberg et al. have to do some additional work to ensure users know when they have broken the rules and how. This really should be a basic feature of Facebook, but clearly the firm bans first and asks questions later.

Something we can sympathise with is the difficulty of determining when content is satire and when it is malicious. Intent is a very tricky thing to quantify, and sadly it’s not something we’ve managed to teach computers. That means Facebook will likely have to bring in more human moderators or slow its existing moderators down.

“Given the context-specific nature of satire, we are not immediately able to scale this kind of assessment or additional consultation to our content moderators. We need time to assess the potential tradeoffs between identifying and escalating more content that may qualify for our satire exception, against prioritizing escalations for the highest severity policies, increasing the amount of content that would be escalated, and potentially slower review times among our content moderators,” the social network said.

While this is all tricky terrain to navigate, these changes have been a long time coming.

[Source – Oversight Board]
