Legal experts have launched an investigation into what they describe as “dire” working conditions for content moderators employed to monitor Meta platforms in Ghana.
The probe, led by Accra-based consultancy Agency Seven Seven and UK legal non-profit Foxglove, follows rising concerns about the psychological toll on workers reviewing violent and disturbing material online.
The moderators, who work for a third-party firm contracted by Meta, are tasked with filtering harmful content, including graphic images of child abuse and murder, from platforms such as Facebook and Instagram. Lawyers say they are often forced to do so without sufficient mental health support.
“What we are talking about here is potential psychological injury,” said Carla Olympio, founder of Agency Seven Seven, who has recently held discussions with affected workers.
Martha Dark, founder of Foxglove, told AFP, “Everyone is suffering in terms of their mental health — whether that’s post-traumatic stress disorder, insomnia, depression, suicidal thoughts, and more. The situation is pretty dire.”

This new investigation comes in the wake of several high-profile lawsuits in Kenya against Meta over similar allegations.
Kenya’s now-closed moderation hub, run by a different contractor, was at the centre of claims involving unlawful dismissals, union-busting, and neglect of workers’ mental health. A separate case alleges that Facebook’s algorithm promoted hate speech in Ethiopia, leading to real-world violence.
Meta’s content moderation operations in Ghana had largely been kept out of the public eye until recently, but lawyers now estimate that around 150 moderators work in Accra for Majorel, a subcontractor owned by the French firm Teleperformance.
Employees reportedly face shared accommodation, poor pay transparency, and pressure to review large volumes of traumatic content to qualify for bonus pay.
One moderator, who moved from East Africa to Ghana for the job, told The Guardian that the psychological burden of the work drove him to attempt suicide.
Neither Meta nor Teleperformance responded to questions from AFP. Teleperformance did, however, tell The Guardian that it has a strong well-being programme with licensed psychologists, as well as competitive compensation, while Meta maintained that it takes the welfare of its content reviewers seriously.
Foxglove, which is also supporting the lawsuits in Kenya, argues that content moderation can be done safely, with clear limits on exposure and access to appropriate psychiatric care. It cites international best practice, including the restrictions placed on Irish police officers who handle child exploitation material.
With legal proceedings continuing in Kenya and a new spotlight now cast on Ghana, Meta’s outsourced labour model in Africa is facing renewed legal and ethical scrutiny.