By Kanishka Singh
(Reuters) – Facebook Inc said on Tuesday that it was preparing for Myanmar’s general election in November by improving the detection and removal of hate speech and content that incites violence and preventing the spread of misinformation.
The company said in a blog that between now and Nov. 22, it would remove “verifiable misinformation and unverifiable rumours” that are assessed as having the potential to suppress the vote or damage the “integrity” of the electoral process.
“For example, we would remove posts falsely claiming a candidate is a Bengali, not a Myanmar citizen, and thus ineligible,” Facebook said.
The platform came under fire in Myanmar after a military-led crackdown in 2017 that forced more than 730,000 Rohingya Muslims to flee the country. U.N. investigators said Facebook played a key role in spreading hate speech that fuelled the violence.
The company has long said it works to stop hate speech.
Facebook said it was working with two partners in Myanmar to verify the official Facebook pages of political parties. It now has three fact-checking partners in Myanmar: BOOM, AFP Fact Check and Fact Crescendo.
The company took action against 280,000 pieces of content in Myanmar for violating its standards prohibiting hate speech in the second quarter of this year, up from 51,000 such pieces that it took action against in the first quarter.
It also said it had introduced a new feature that limits to five the number of times a message can be forwarded.
The feature is now available in Myanmar and, over the course of the next few weeks, would be made available to Messenger users worldwide, the company added in the blog.
The 2020 election in Myanmar, scheduled for Nov. 8, will be the country's second democratic election since the end of almost half a century of strict military rule.
(Reporting by Kanishka Singh in Bengaluru; Editing by Christian Schmollinger, Robert Birsel)