Nearly a year after violence exploded against Myanmar’s Rohingya Muslim population, Facebook has finally taken some form of sustained action to prevent the country’s military from using the platform to spread misinformation and hate speech critical of the minority group.
On Monday, the social media giant announced it was banning 18 accounts and 52 pages associated with the Burmese military, immediately after a United Nations report accused it of carrying out genocide against the Rohingya. Among those banned from Facebook was Senior General Min Aung Hlaing, Myanmar’s commander-in-chief.
“International experts, most recently in a report by the UN Human Rights Council…have found evidence that many of these individuals and organizations committed or enabled serious human rights abuses in the country,” a Facebook blog post on Monday read. “We want to prevent them from using our service to further inflame ethnic and religious tension.”
While Facebook’s response might seem speedy in relation to the UN report, the company has spent the last year largely ignoring repeated warnings about how its platform was being used to spread hate speech and fake news about the Rohingya minority.
In April, for instance, research by the digital analyst Raymond Serrato showed that anti-Rohingya hate speech began to explode in Myanmar around mid-August 2017, with one anti-Rohingya group experiencing a 200 percent increase in post interactions. A separate analysis by the Institute for War and Peace Reporting found an abundance of fake news, derogatory anti-Rohingya terms and signs denoting “Muslim-free” areas.
In October last year, The New York Times documented how ultranationalist monk Ashin Wirathu had been aggressively pushing a false narrative on Facebook about how the Rohingya were dangerous outsiders. Wirathu had already been barred by the Myanmar government from preaching publicly on grounds of hate speech at that point.
The UN itself has previously blamed Facebook for its role in allowing lies and misinformation about the Rohingya — more than 700,000 of whom have fled into neighboring Bangladesh to escape the brutal crackdown — to spread across its platform.
“[Social media] has…substantively contributed to the level of acrimony and dissension and conflict, if you will, within the public,” Marzuki Darusman, chairman of the UN Independent International Fact-Finding Mission on Myanmar, noted in March. “As far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media.”
For most in Myanmar, Facebook is the internet. From 2014 to 2018, the country’s Facebook usage ballooned from 2 million to 30 million, thanks in large part to the app coming pre-installed on affordable cell phones in the country. But the country has not experienced the same upward swing in internet or computer literacy, leaving the population vulnerable to fake news and disinformation campaigns.
Facebook CEO Mark Zuckerberg has himself admitted Facebook has not done nearly enough to halt the spread of hate speech and fake news in Myanmar. Earlier this year, he told regional activist groups the company was trying to change that.
“In addition to improving our technology and tools, we have added dozens more Burmese language reviewers to handle reports from users across all our services. We have also increased the number of people across the company working on Myanmar-related issues, and we now have a special product team working to better understand the specific local challenges and build the right tools to help keep people there safe,” he wrote, in a letter obtained by the Times.
He added, “We are grateful for your support as we map out our ongoing work in Myanmar, and we are committed to working with you to find more ways to be responsive to these important issues.”