Meta Platforms, the owner of Facebook, is now considering a review of how Facebook and Instagram have been used to spread content that increases the risk of violence in Ethiopia.
Meta's Oversight Board, set up by the company to address criticism of its content moderation practices, makes binding decisions on a small number of challenging cases and provides non-binding policy recommendations.
Meta has come under scrutiny from lawmakers and regulators since whistleblower Frances Haugen released internal documents showing the company's struggles to police content in countries, such as Ethiopia, where such speech is most likely to cause harm.
A year-long conflict between Ethiopian government forces and rebel forces from the northern Tigray region has killed thousands of people and displaced millions.
Responding to the board's December recommendations in a case involving content posted in Ethiopia, Meta said it had "invested significant resources" in finding and removing potentially harmful content.
Last month, the board upheld Meta's original decision to remove a post alleging that ethnic Tigrayan civilians had committed atrocities in Ethiopia's Amhara region. Because Meta had restored the post after the user appealed to the board, the company had to take it down again.
Despite removing the post, Meta said it disagreed with the board's reasoning that it should come down because it was an "unverified rumour" that significantly increased the risk of imminent violence, arguing that such a standard would impose "a journalistic publishing standard on people."
An oversight board spokesman said in a statement: “Meta’s existing policies prohibit rumours that contribute to imminent violence that cannot be debunked in a meaningful timeframe, and the Board made recommendations to ensure these policies are effectively applied in conflict situations.
“Rumors alleging an ethnic group is complicit in atrocities, as found in this case, have the potential to lead to grave harm to people.”
The board also recommended that Meta commission a human rights due diligence assessment, due within six months, examining its language capabilities in Ethiopia and the measures it has taken to prevent the misuse of its services there.
The company said, however, that not all elements of the recommendation were feasible in terms of timing, data science or approach. It planned to continue its existing human rights due diligence and to provide an update within the next few months on whether it could act on the board's recommendation.