In Myanmar, Facebook has become an essential communication tool and a primary way of spreading the word about the ethnic cleansing of Rohingya Muslims, who are facing violence from the country’s military. But activists inside and outside Myanmar have told the Daily Beast that Facebook is removing posts documenting the violence, supposedly for violating community standards. The takedowns cut off a vital means of organizing the community and documenting the ethnic cleansing, and they raise questions, once again, about Facebook’s role as a distributor of information.
Facebook told the Daily Beast that the company is looking into it, and pointed to its community standards. But those standards are vague and unevenly enforced, and, as we’ve written before, depictions or discussions of violence fall into a large gray area. One activist reported that his descriptions of military atrocities, such as “#Rohingya homes in the downtown of #Maungdaw are still being set ablaze by the #Myanmar military & #Rakhine extremists,” were removed by Facebook. Another user had a poem about the violence taken down, though it’s difficult to see how it violated Facebook’s standards: the poem described the turmoil only in vague, non-graphic terms.
It’s unclear why the posts are being targeted, but it’s possible for dedicated groups to brigade a given post or user, reporting content or people en masse to give the impression of an urgent violation. The takedowns of the posts cited in the Daily Beast’s report are clearly unwarranted, and the overzealous silencing of users also comes on the heels of new Facebook rules about what can and can’t be monetized on the platform. Descriptions of Myanmar’s current upheaval, even for informative purposes, cannot be monetized. Facebook’s implicit position is clear: anything that might be distressing is discouraged, at the very least.