Without content moderation, social media platforms breed criminal activity. All platforms are moderated by men and women who put themselves on the frontlines to perform this crucial role. They monitor what is posted, who posts it, and the impact of those posts. If a post violates community guidelines, moderators remove it from the platform. The heroic young African content moderators who undertake this invisible work are seldom recognised: many people are not aware that content moderation takes place 24/7 and is carried out principally by human beings, not algorithms.

Like any other community, social media platforms are made up of all kinds of people: teachers, doctors, students, religious people and wrongdoers. Offline, law enforcement usually ensures people abide by the law, but maintaining order on social media is especially hard because it is a virtual, borderless and constantly active community. At any hour of the day or night, thousands of people are posting harmful content online, such as child abuse, hate speech, bullying and fraud.

We content moderators are the online world’s police officers. Our job is to protect the online community by deleting harmful content before it spreads and causes even more suffering. The job carries serious risks. It involves watching graphic content for hours at a time – days and nights of child abuse, suicide, rape, all kinds of violence and evil acts. Our working conditions aggravate the wounds this inflicts: shifts last almost ten hours a day, nearly all of it spent in front of graphic content. No wonder our mental health suffers. We find it difficult to sleep, we are left mentally disturbed, we get nightmares.

In October 2023, a report by Euronews stated that 20% of the moderators working for Meta’s subcontractor in Barcelona, Spain, were on sick leave due to psychological trauma.¹ In an article linking the case brought by Spanish content moderators with other cases in Kenya and the US, TechCrunch scathingly described the content moderation industry as “outsourced-burn-out-as-a-service”.² African moderators sit at computers moderating Meta’s platforms, sometimes for over ten hours a day, watching suicide, rape and child abuse, with fewer labour and social protections than their European and American counterparts. African governments and the African Union must step in and regulate before it is too late. If they wish to continue to benefit from the business brought by tech companies, they must recognise content moderators as a legitimate part of the African workforce and enforce strict protections, just as they do for other sectors.

In Europe, companies providing content moderation services generally offer psychiatric support to moderators (though whether it is adequate is questionable). African social media moderators, on top of being poorly paid, are not offered even this. Tech companies and the outsourcing companies they engage do this deliberately, because the sector is not regulated here in Africa. They will take any opportunity to cut costs, and it is we moderators who bear the consequences.

As Facebook moderators in the Nairobi centre, we faced hellish conditions born of such cost-cutting. It was like a prison. When you encounter dangerous, graphic content such as terrorists beheading people or women and children being raped, it’s devastating. After deleting these posts, preventing them from going live and being seen by millions, I often felt so terrible that I couldn’t continue working. Yet I was never allowed to rest. They wanted me to continue sitting at my computer, even after watching such content. We were given limited time for breaks, and bathroom breaks were the worst: we were never allowed to spend more than two minutes in the washroom. Beyond that, Facebook’s SRT system would mark you as unavailable.³ All the time you were marked as unavailable had to be recovered at the end of the shift.

I remember one day in 2020. After I watched a video of a man having sex with a tilapia, it became too much. I deleted the post from the platform and walked away from my computer. When the team leader found me outside, she forced me to work an additional 45 minutes to recover the time. She didn’t listen to my explanations. That day, I worked ten hours and 45 minutes.

No African country has yet regulated how social media moderators should be paid or what their workplaces should look like. At the centre, we never had any labour protections. Managers could change your shift without consulting you, and expected you to comply or lose your contract. If someone complained, they would be threatened with warning letters; after the third, their contract would be terminated. The warehouse we worked from was originally an export packing house, hot and poorly constructed. Samasource (Sama) and Facebook turned it into an office by splitting it into two levels, with the upper level close to the roof. Imagine sitting there, in this hot sub-Saharan region. During that period, I experienced hell.

Many colleagues suffered food poisoning from the food we were served: sometimes it was too cold, other times half-cooked. On the production floors there were TV screens meant to relieve stress by playing music or broadcasting football games, but we were not allowed to switch them on; in fact, being caught doing so was an offence. In all my three years there, the TVs stayed off. Until African governments prove they care about their people, tech companies will continue to force content moderators to work in inhumane conditions, under exploitative terms, sustaining potentially life-altering damage for unacceptable pay.

Africa’s content moderators play an outsize role in keeping Africa, and the world at large, safe online. Many of the Meta moderators I worked with were from countries including Ethiopia, South Africa, Uganda, Burundi, Nigeria, Rwanda and Namibia. They are still in Nairobi, and the majority are suffering from mental illness and PTSD as they wait for the Kenyan court to determine their case against Meta (the company that owns Facebook, Instagram and WhatsApp) and the business process outsourcing (BPO) firms Samasource and Majorel Kenya Limited. These moderators are going through a lot, facing numerous difficulties in a foreign country. Without a salary to pay rent, and without medical insurance, some live six to a single-room apartment. Despite informing their governments through their countries’ embassies in Kenya, none have been offered support or rescue. Our hope still rests with the Kenyan court.

Tech companies are only able to mistreat African moderators because not one of our governments has yet laid down clear rules. It is the responsibility of government to ensure that workers are accorded basic rights: the right to appropriate medical services, the right of association, the right to proper pay, and the right to decent working conditions. When tech companies operate in Africa, they often fail to fulfil any of these obligations. Our governments expect social media platforms to be safe, yet through their inaction those same governments are failing the very people – their citizens, us moderators – who risk everything to protect the online community.

As content moderators, we are not children of a lesser god. African governments must act now. We are their citizens; we deserve protection.

1. ‘It scars you for life’: Workers sue Meta claiming viewing brutal videos caused psychological trauma, Euronews, 19 October 2023. https://www.euronews.com/2023/10/19/it-scars-you-for-life-workers-sue-meta-claiming-viewing-brutal-videos-caused-psychological

2. Meta’s content moderator subcontractor model faces legal squeeze in Spain, TechCrunch, 19 January 2024. https://techcrunch.com/2024/01/19/meta-subcontractor-ruling-barcelona/

3. When someone on one of Meta’s platforms flags a piece of content, it is sent to the Single Review Tool (SRT) system for a human moderator to review. Not every piece of content is reported by users: one of the biggest challenges is that content can also be reported automatically, by Meta’s AI, which greatly increases the volume of reports. And while posts can be auto-reported, they cannot be auto-deleted. It takes a human moderator to review every piece of content and determine whether or not it should be removed.
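To make the footnote’s point concrete, here is a minimal sketch of that workflow in Python. It is an illustration only, not Meta’s actual SRT: the names (ReviewQueue, flag, review) are hypothetical, and the model simply assumes that user reports and AI auto-reports feed the same queue, while only a human decision can remove a post.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Report:
    post_id: str
    source: str  # "user" or "ai": both routes end up in the same human queue

class ReviewQueue:
    """Toy model of the workflow in footnote 3: reports accumulate from
    users and from automatic AI flagging, but nothing is deleted until
    a human moderator has reviewed it."""

    def __init__(self):
        self._queue = deque()

    def flag(self, post_id: str, source: str) -> None:
        # Auto-reporting ("ai") only adds to the queue; it never deletes.
        self._queue.append(Report(post_id, source))

    def review(self, human_decides) -> list[str]:
        # Every report is examined by a person; `human_decides` stands in
        # for the moderator's judgement call on each piece of content.
        removed = []
        while self._queue:
            report = self._queue.popleft()
            if human_decides(report):
                removed.append(report.post_id)
        return removed

queue = ReviewQueue()
queue.flag("post-1", source="user")  # flagged by another user
queue.flag("post-2", source="ai")    # auto-reported by a classifier
print(queue.review(lambda r: True))  # only this human step removes posts
```

The sketch shows why auto-reporting raises moderators’ workload rather than lowering it: the AI can only grow the queue, and every item in it still has to pass through a person.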
