Coming under fire for not doing enough to stop the misuse of its platform for spreading hate in Myanmar, Facebook has promised to invest more in people, technology and partnerships to make the social network a safer place for people to express themselves.
Facebook on Monday said it was looking into establishing a separate policy defining its approach to content moderation with respect to human rights, as per a recommendation from San Francisco-based nonprofit Business for Social Responsibility (BSR).
BSR was commissioned by Facebook to make an independent human rights impact assessment on the role of the social network’s services in Myanmar.
“We’re also working to hire additional human rights specialists to strengthen engagement with and solicit input from NGOs, academia, and international organisations,” Alex Warofka, Product Policy Manager at Facebook, said in a statement.
In its report, BSR concluded that prior to this year, Facebook was not doing enough to help prevent its platform from being used to foment division and incite offline violence. Facebook agreed with this assessment.
“We agree that we can and should do more,” Warofka said.
In Myanmar, “Facebook has become a means for those seeking to spread hate and cause harm, and posts have been linked to offline violence”, according to the assessment by BSR.
A large proportion of this hate speech has been directed towards the Rohingya Muslims.
In April, the Guardian reported that hate speech on Facebook in Myanmar had exploded during the Rohingya crisis, which was triggered by a military crackdown in Rakhine state in August 2017.
Tens of thousands of Rohingya were killed, raped and assaulted, villages were razed to the ground and more than 700,000 Rohingya fled over the border to Bangladesh, the report said.
But over the course of this year, Facebook has taken some corrective action, the BSR report said.
“In the third quarter of 2018, we saw continued improvement: we took action on approximately 64,000 pieces of content in Myanmar for violating our hate speech policies, of which we proactively identified 63 per cent — up from 13 per cent in the last quarter of 2017 and 52 per cent in the second quarter of this year,” Warofka said.
BSR said that Facebook should improve enforcement of its Community Standards, the policies that outline what is and is not allowed on Facebook.
“Core to this process is continued development of a team that understands the local Myanmar context and includes policy, product, and operations expertise,” Warofka said.
Earlier this year, Facebook established a dedicated team across product, engineering, and policy to work on issues specific to Myanmar.
Facebook promised that it would grow the team of native Myanmar language speakers reviewing content to at least 100 by the end of 2018.
“We have now hired and onboarded 99 of these reviewers. This team is making a difference, improving the development and enforcement of our policies,” Warofka said.
Facebook has about 20 million users in Myanmar.