Earlier this year Facebook commissioned an independent investigation into its role in violence against Rohingya people in Myanmar.

In 2017 thousands of Rohingya people were killed and the BBC reports that 700,000 Rohingya fled Myanmar after attacks on their people.

The investigation has now found that Facebook was the medium through which users incited offline violence and spread hatred online.

That investigation was conducted by Business for Social Responsibility (BSR) and its findings and recommendations have been published in the report, Human Rights Impact Assessment: Facebook in Myanmar (HRIA).

The assessment was done between May and September of this year and the UN Guiding Principles on Business and Human Rights were used as a basis for the methodology.

“This methodology included a documentation review, direct consultation with around 60 potentially affected rightsholders and stakeholders during two visits to Myanmar by BSR staff, and interviews with relevant Facebook employees. This HRIA was funded by Facebook, though BSR retained editorial control over its contents,” the report reads.

The crux of the report is that Facebook simply didn’t do enough to prevent its platform being used to incite violence offline.

This is something Facebook product policy manager Alex Warofka is keenly aware of.

“The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more,” Warofka wrote.

The report covers five areas Facebook must focus on to prevent something like this from ever happening again and the social network outlined how it will address these areas.

Governance and accountability at Facebook

Firstly, BSR recommends Facebook create a stand-alone human rights policy. This policy should aid in formalising a leadership, governance and accountability structure within Facebook that will oversee the firm’s human rights strategy, approach and milestones.

The firm says that human rights principles such as the Universal Declaration of Human Rights and the International Covenant on Civil and Political Rights are used to dictate what is and isn’t allowed on its platform.

BSR also recommends Facebook publish human rights updates to the public periodically. The social network did not address this recommendation.

A recommendation was also made that Facebook conduct human rights impact assessments in other, high-risk markets.

Enforcing community standards

The next area of focus is the betterment of enforcing Facebook’s Community Standards.

Suggestions include interpreting Facebook’s credible violence policy more strictly as it relates to misinformation, investing in AI to better identify misinformation, and preserving removed content for later use as evidence.

Facebook has already implemented, or is in the process of implementing, these suggestions.

“We have, for example, updated our credible violence policy such that we now remove misinformation that has the potential to contribute to imminent violence or physical harm. We have also undertaken research to better understand how content that doesn’t ordinarily break our rules (for example, potentially hateful content that doesn’t amount to hate speech under our policies) has the potential to incite offline harm. In this vein, we are working with partners to use CrowdTangle and other tools to analyze potentially harmful content and understand how it spreads in Myanmar,” wrote Warofka.

Meaningful engagement

The third suggestion is that Facebook engage more meaningfully with the people of Myanmar.

This includes publishing a Myanmar-specific version of the Community Standards Enforcement report, holding an annual or bi-annual public briefing on Facebook’s human rights strategy in Myanmar, and engaging with locals to better understand the tactics used to spread misinformation.

Facebook is doing this by both publishing data on the progress it has made in fighting misinformation and proactively detecting and removing hate speech.

“In the third quarter of 2018, we saw continued improvement: we took action on approximately 64,000 pieces of content in Myanmar for violating our hate speech policies, of which we proactively identified 63%—up from 13% in the last quarter of 2017 and 52% in the second quarter of this year,” Warofka said.

Not Facebook alone

While Facebook was used as the medium through which misinformation was spread, the report highlights the fact that Facebook is not solely responsible and wider change is needed.

That said, the report also points out that to many Myanmar citizens, Facebook is the internet. So while Facebook is not solely responsible, it has a significant role to play in rectifying the matter.

“As such, BSR recommends that Facebook play an active role in advocacy efforts aimed at policy, legal, and regulatory reform in Myanmar, support the country’s transition to Unicode, and continue to invest in efforts to increase digital literacy and counter hate speech,” Warofka says.

The mention of Unicode is important because, according to Facebook, Myanmar is the only country with a significant online presence that has not standardised on Unicode for text.

Instead, Myanmar widely uses Zawgyi, a non-standard encoding of the Burmese language, and that causes problems.

Warofka explains, “It makes automation and proactive detection of bad content harder, it can weaken account security, it means less support for languages in Myanmar beyond Burmese, and it makes reporting potentially harmful content on Facebook more difficult.”
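Warofka’s point about automated detection can be made concrete with a small sketch. Zawgyi reuses Burmese Unicode codepoints but assigns them different glyphs and stores some marks in visual rather than logical order, so the same visible word can be backed by entirely different character sequences. The codepoints below are a simplified, illustrative pairing (not a production Zawgyi detector), showing why a content filter built on Unicode spellings can silently miss Zawgyi-encoded text:

```python
# Simplified illustration: the Burmese cluster "မြ" (MA + medial RA)
# stored two ways.
# Unicode: consonant first, then U+103C MYANMAR SIGN MEDIAL RA.
unicode_text = "\u1019\u103C"

# Zawgyi (simplified): the medial-ra mark is stored *before* the
# consonant, reusing U+103B (which Unicode defines as medial YA).
# Same pixels under a Zawgyi font, different underlying characters.
zawgyi_text = "\u103B\u1019"

# A naive keyword filter keyed on the Unicode spelling fails on
# Zawgyi-encoded text, even though a reader sees the same word.
print(unicode_text == zawgyi_text)      # False
print(unicode_text in zawgyi_text)      # False
```

This is why tooling such as Google’s open-source myanmar-tools exists to detect and convert Zawgyi text, and why a platform-wide move to Unicode simplifies both moderation and security work.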

For the time being Facebook has removed Zawgyi as an option for new Facebook users to combat this and the firm says it supports Myanmar’s transition to Unicode.

The social network has also said it will work with publishers and newsrooms to help build capacity and resources to help slow down the spread of misinformation.

Risk mitigation

The final suggestion deals with the future and how Facebook should prepare for it.

Elections in 2020, the rise of WhatsApp and human rights should all be top of mind for Facebook and as Warofka puts it, they are.

“Our dedicated product, engineering, partnerships and policy teams will continue to work on issues specific to Myanmar, and to address a diverse set of challenges. This includes our work to root out abuse in the run up to the country’s 2020 elections,” he said.

Perhaps most interesting is that Facebook has noted that other countries might experience issues such as this in the future, and it’s likely it will take learnings from this Myanmar experience and apply them to other countries.

Locally speaking, Facebook recently launched fact checking in South Africa to tackle the issue of false news being spread on its platform.

[Source – Facebook]