Last Friday the world was shocked by news of two mosque shootings in the New Zealand city of Christchurch, but it appears that Facebook wants to use the tragic terror attack as a platform to show how much good work it's doing.

To that end, the social media giant said it removed roughly 1.5 million videos from its site within the first 24 hours of the shootings.

Facebook took to Twitter to confirm the numbers, with the company's Mia Garlick adding that 1.2 million of those videos were blocked at the point of upload. Videos were removed if they praised or supported the shooter's actions, using a combination of automatic detection and human moderators.

What Facebook fails to mention, however, and what TechCrunch points out, is that 300 000 videos pertaining to the New Zealand mosque shooting slipped through their safety net.

That represents an estimated 20 percent failure rate, and critics of the social media platform are now asking them to release more meaningful statistics instead of the "vanity" figures they recently shared.

As such, critics want Facebook to release figures on how many shares, views, replies and comments these videos garnered before they were taken down. This would give a more accurate picture of how quickly content around this tragic event spread on Facebook, and could demonstrate how much work they still have to do when it comes to policing their own platform.

This isn't the first time the site has come under fire, with its live-streaming offering in particular being criticised after users shared abusive and disturbing content that remained up for several hours before the site took it down.

We understand that Facebook cannot catch every single video, but their latest proclamations paint a very skewed picture of how things are actually unfolding on their platform.

Hopefully they’ll be more transparent in future, and not try to jump on a tragedy in order to showcase the great job they think they’re doing.