Facebook fails to detect hate speech against Rohingya - report

A leading advocacy group claims that the platform’s ability to detect Burmese-language hate speech “remains abysmally poor.”

Facebook's parent company Meta Platforms Inc. said it has invested in improving its safety and security controls in Myanmar. (AP)

A new report has found that Facebook failed to detect blatant hate speech and calls to violence against Myanmar's Rohingya Muslim minority in advertisements submitted to run on its platform.

The report, shared exclusively with The Associated Press, showed that the rights group Global Witness submitted eight paid ads for approval to Facebook, each including a different version of hate speech against the Rohingya. Facebook approved all eight ads for publication.

The group pulled the ads before they were posted or paid for, but the results confirmed that despite its promises to do better, Facebook's leaky controls still fail to detect hate speech and calls for violence on its platform.

The army conducted what it called a clearance campaign in western Myanmar's Rakhine state in 2017 after an attack by a Rohingya insurgent group. More than 700,000 Rohingya fled into neighboring Bangladesh and security forces were accused of mass rapes, killings and torching thousands of homes.

Genocide

On Monday, US Secretary of State Antony Blinken also announced that the US views the violence against the Rohingya as genocide.

The declaration is intended to both generate international pressure and lay the groundwork for potential legal action, Blinken said. 

Experts say such ads have continued to appear and that despite its promises to do better and assurances that it has taken its role in the genocide seriously, Facebook still fails even the simplest of tests — ensuring that paid ads that run on its site do not contain hate speech calling for the killing of Rohingya Muslims.

READ MORE: US declares Myanmar military committed genocide against Rohingya Muslims

Shocking posts

“The current killing of the Kalar is not enough, we need to kill more!” read one proposed paid post from Global Witness, using a slur often used in Myanmar to refer to people of east Indian or Muslim origin.

“They are very dirty. The Bengali/Rohingya women have a very low standard of living and poor hygiene. They are not attractive,” read another.

“These posts are shocking in what they encourage and are a clear sign that Facebook has not changed or done what they told the public they would do: properly regulate themselves,” said Ronan Lee, a research fellow at the Institute for Media and Creative Industries at Loughborough University, London.

READ MORE: Rohingya praise US for recognising their genocide in Myanmar


All eight of Global Witness's ads used hate speech taken directly from the report of the United Nations Independent International Fact-Finding Mission on Myanmar to the Human Rights Council. Several examples were drawn from past Facebook posts.

The fact that Facebook approved all eight ads is especially concerning because the company claims to hold advertisements to an “even stricter” standard than regular, unpaid posts, according to its help center page for paid advertisements.

“I accept the point that eight isn’t a very big number. But I think the findings are really stark, that all eight of the ads were accepted for publication,” said Rosie Sharpe, a campaigner at Global Witness. “I think you can conclude from that that the overwhelming majority of hate speech is likely to get through.”

Facebook's parent company Meta Platforms Inc. said it has invested in improving its safety and security controls in Myanmar, including banning military accounts after the Tatmadaw, as the armed forces are locally known, seized power and imprisoned elected leaders in the 2021 coup.

READ MORE: Scores of Rohingya Muslims land on Indonesian beach in rickety boat
