Facebook Admits It Didn’t Do Enough to Stop Myanmar Violence

Facebook has admitted that it did not do enough to prevent the incitement of violence and hate speech in Myanmar, following a report that concluded the platform was used to spread harmful and racially inflammatory content.

The report, carried out by the San Francisco-based nonprofit Business for Social Responsibility, found: “Facebook has become a means for those seeking to spread hate and cause harm, and posts have been linked to offline violence.” Facebook was used to incite violence and coordinate harm in Myanmar, much of it directed at the Rohingya, the country’s Muslim minority.

Alex Warofka, a Facebook product policy manager, said the report showed that “prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more.”

The company said it had hired 100 native speakers to review Myanmar content and that, in 2018, it took down 64,000 pieces of content.