Facebook is improving its alerts regarding misinformation. These improvements will provide a bit more context and put a bit more responsibility on users to ensure they aren’t spreading misinformation.
The first of those updates is a pop-up that will appear when you join a Page that has repeatedly shared false information. While Facebook has done this before, this updated pop-up provides more contextual information as well as a link to more information about Facebook’s fact-checking programme.
But it’s not just Pages that are in the firing line; now your account could be too.
“Starting today, we will reduce the distribution of all posts in News Feed from an individual’s Facebook account if they repeatedly share content that has been rated by one of our fact-checking partners. We already reduce a single post’s reach in News Feed if it has been debunked,” said Facebook.
Beyond that, if you share content that is later found to be false, the notification you receive will contain a lot more information than it previously did.
“The notification includes the fact-checker’s article debunking the claim as well as a prompt to share the article with their followers. It also includes a notice that people who repeatedly share false information may have their posts moved lower in News Feed so other people are less likely to see them,” the social network explained.
Nearly a year ago, Twitter labelled a tweet from former US president Donald Trump as misinformation. The move prompted Trump to set his sights on social media firms and in response Facebook said it would not participate in regulating what was said on its platform.
“I just believe strongly that Facebook shouldn’t be the arbiter of truth of everything that people say online,” Mark Zuckerberg told Fox News nearly a year ago.
It’s good to see that Facebook has come to the realisation that it can’t simply decide not to fight misinformation on its platform because its CEO doesn’t feel like it.
[Source – Facebook]