Rights organisations are urging Meta Platforms to seize the opportunity to improve its content moderation in Africa after the social media giant’s main third-party contractor in the region announced that it would no longer screen harmful posts for the company.
Sama, a Kenya-based outsourcing company, announced on January 10 that it would stop providing content moderation services to the owner of Facebook, WhatsApp, and Instagram in March to focus on data labelling work.
Sama said it would also lay off 3% of its workforce, about 200 employees, to streamline operations and boost efficiency. It will continue to supply Meta with data labelling services.
The announcement comes as both Sama and Meta face a lawsuit in Kenya over alleged labour violations and claims that they prevented workers from unionising. Another lawsuit accuses Meta of allowing violent posts to spread on Facebook, inflaming the civil war in neighbouring Ethiopia. Both companies have defended their records.
Digital rights activists said Meta’s efforts to curb harmful content in African countries were woefully inadequate compared with its efforts in wealthier nations, and urged the company to drastically improve its moderation practices.
“With the exit of Sama, now would be a good chance for Meta to put things right and ensure better labour conditions for African moderators in the region,” said Bridget Andere, Africa policy analyst at Access Now.
She added: “Meta should increase the number of moderators for the region to adequately cover local languages and dialects, and also be more transparent about their algorithms which are promoting harmful content.”
Meta did not say whether it had found a replacement third-party contractor for East Africa, but said Sama’s departure would have no negative impact on users of its social media services.
“We respect Sama’s decision to exit the content review services it provides to social media platforms,” a Meta representative told Reuters.
“We’ll work with our partners during this transition to ensure there’s no impact on our ability to review content.”