+++ Content Warning: This story contains themes of Graphic/Explicit Violence, particularly towards women* and BIPOC • Self Harm & Suicide (Including suicidal thoughts) • Rape, Sexual Assault & Harassment.+++

I am a Nigerian content moderator who worked for Meta, moderating its Facebook platform, at the BPO Sama, in Nairobi, Kenya.

I started work on October 19th, 2020, having quit my previous job back in Nigeria to take up the offer. When I got to Kenya I was accommodated for a few weeks in Nyayo Estate with a few others from different countries, after which everyone was allowed to find a place for themselves. Throughout my stay in Kenya, I have lived as a local, without proper immigration status or documentation, which has led to police harassment. I can’t move freely in town for fear of being stopped by the police, harassed or even locked in a cell. I have lived this way since arriving here.

I wasn’t made aware of the nature of the job until our training at Sama’s Nairobi office. It became clearer to me when I began to work on live events and content posted by users on the Facebook platform. The content involved videos and images of human mutilation, suicide, adult sexual activities, child abuse, bullying and harassment, hate speech, violence, murder, rape and more. As a content moderator, I became the gatekeeper who acted as a buffer between the platform’s users and the horrific content posted online. To keep users safe, I received its full impact.

My job was to review and monitor all user-generated content and ensure it complied with Meta’s community standards and guidelines. In my three years working there I engaged with this content daily to keep it from Facebook’s users and the general public. For such a tough job I expected a comfortable, caring and friendly working environment, an environment where I was respected for what I did, where my well-being and mental health was prioritised. Unfortunately, this was not the case. I was pressured to meet daily, weekly and monthly targets, and never had enough time to rest and refresh my mind, away from the toxic content.

There were no mental health evaluations or psychiatric check-ups. At times I couldn’t see a counsellor when I needed to – I would be told to wait until the day scheduled by the team lead. None of these counsellors were qualified. Worst of all, when you were sick you were expected to report to work unless you got a doctor’s sick note. I remember one particular day, I was very sick but still had to go to work to see the office nurse who could refer me to a doctor. When I got there, the sickness got worse, but the team lead didn’t allow me to go to the hospital until the nurse insisted. We were prevented from going to any hospitals except those assigned by the company.

In my view, the working environment was a centre of exploitation. I say so from my own experience. Compared to other moderators globally, we are underpaid here in Kenya. Moreover, while I was working at Sama, I couldn’t complain or raise concerns because I would be ignored or threatened with being fired. As one who left his home country, without another job to fall back on, I remained silent in spite of the horror. When clients visited, we were not allowed to speak to them. Instead, management hand-picked a few people and told them what to say. I found out later that a camera was installed in the room to monitor conversations with the client.

One special evil of working as a content moderator was the non-disclosure agreement, or NDA. Because of it, I was told I could not talk with anyone about my job – not even to a doctor or family member. The NDA was used to threaten us legally, and keep us from speaking out despite the deadly and horrific circumstances we faced.

Some may ask, why didn’t I resign? I thought of it, but I had left my previous job back home for this, and getting another job would be difficult. I felt I had no choice but to continue. A lot of thoughts ran through my mind, but there was nothing I could do. It was never easy but I had to face the job’s horror to raise my young family. The decision was painful and tough, but I had to.

As moderators we were restricted from having a union. Mr Daniel Motaung, a South African content moderator who worked at Sama, came up with the idea of a union for content moderators. He was then fired. After that, nobody dared talk about unionisation for fear of losing their job. On March 31st, 2023, all of us content moderators working at Sama for Meta were laid off. It happened without a clear and proper explanation as to why. They called it redundancy but I say it was unlawful because my contract was to be renewed on October 19th, 2022. Instead, it was extended to February 2023. I remember asking the HR admin why, but I was told not to worry; the company just wanted to have everyone’s contract renewed at the same point.

By that point I had brought my wife to live with me in Kenya, so it was a shock. If I had known before, I wouldn’t have spent such a huge sum of money bringing her over. At this point there was nothing any of us could do. We had a series of meetings with management concerning the monetary aspect of the layoff package, but they had no good response so we took the legal route. As content moderators, we’ve been exploited and our mental health affected, yet we were laid off without proper care or compensation. We’ve only been able to take the bold step towards unionising due to the litigation. Our aim is to bring all moderators under a protection that ensures workplace safety and their legal rights. In the near future we look forward to having the union as a global voice, not just a national or regional one. We place our faith in the Kenyan courts.

The case is going through the Kenyan employment and labour relations court, so anyone can follow the process. Another consequence of these layoffs was that all of us from Sama were blacklisted for work. For instance, I and most of my colleagues applied to do content moderation at Majorel, another BPO, but none of us were employed. Some were even told by the recruiter they would not get the job because they are former Sama moderators.

We went to court fighting for our human and labour rights. As foreigners coming from different African countries (including Nigeria, South Africa, Uganda, Ethiopia and Burundi) we’ve been exploited by these foreign Big Tech companies that use BPOs to their own advantage. As such, we are asking for justice. My mental health has been badly and irreversibly affected; I need serious psychiatric help, as well as compensation for my well-being now and in the near future. One consequence of this job has been to keep me in a state where I don’t see things the same way others do. My emotions, reasoning and ways of life have been totally changed, influenced negatively by the horrific content.

Waiting for the court’s judgement is hard, too. Meta and Sama have not followed the court’s orders that the redundancy be placed on hold, that we remain on the payroll, and that our immigration status be regularised until the court hearing. After several court hearings we were advised to go for mediation and settle out of court, but what Meta and Sama offered was totally unreasonable compared to the damage inflicted on the mental health of us content moderators. So, for now, the legal proceedings are ongoing.

We trust the court to give us a fair hearing and judgment. As an affected content moderator, I call on the Kenyan government to see that justice prevails. I also call on African leaders to say ‘No’ to the exploitation of African youth by Big Tech companies. It’s obvious they come to Africa in search of cheap labour, and after using a set of moderators for a while, they dump them and search for fresh hires. If they notice their operations in a particular country are questioned by the law, they move to another African country where they can continue exploiting the African youth. They are taking advantage of these vulnerabilities. For Big Tech companies to operate in Africa, they should, among other things: provide a good working environment and favourable working conditions; comply with workers’ and labour rights; pay good, equal wages to all content moderators globally; provide mental health evaluations and support to all moderators; allow free unionisation; and provide mental health insurance and compensation after contract termination.

I will end with advice to anyone aspiring to work for a tech company like Meta or TikTok. I was ignorant when I was asked to sign a contract without being given the time to read through the documents. This was an evil trick to keep me silent throughout these years of horror. Demand transparency in the contract and seek independent legal advice and guidance. If you must work for Big Tech, seek clarity on your benefits and entitlements. Because of my horrific experience as a content moderator, I am now an advocate for fair working conditions for all online platform workers. I and my colleagues will go the extra mile to obtain justice for moderators who have already been harmed, and advocate for the protection of the rights of all online platform workers, now and in the future.
