Why Cooperatives for Content Moderators and Tech Workers in Africa?

Content moderation is one of the least understood, yet most critical, jobs on the internet. We are the unseen workers behind platforms like Facebook, TikTok, Instagram, ChatGPT, and many others. Our job is to filter and review the content users post on these platforms. While most users simply scroll through their feeds, we are the ones ensuring that harmful, toxic, or illegal content, such as hate speech, graphic violence, and even suicide videos, is flagged and removed.

The Challenges: Isolation, Secrecy, Pressure, and Exposure to Toxic Content

For four years, I worked as a content moderator for Facebook, processing hundreds of posts daily. The role was shrouded in secrecy because of a non-disclosure agreement we were required to sign just a few days after being hired by Sama, an outsourcing company for Facebook (now Meta). A few of us new workers were put in a room, handed the agreements, and made to sign and return them within minutes. One clause I remember stated that I could face up to 20 years in jail if I ever disclosed that I was working for Facebook. I wasn't even allowed to talk about what I did. When people asked, I would tell them I was a translator or worked in customer service, because content moderation was something we were not supposed to discuss. This isolation and secrecy made it difficult for workers like me to organize or share our experiences and concerns.

This was one of the first major challenges we faced. We had no way to connect with other workers, or with society at large, to talk about the emotional toll of the job or the unfair conditions we were working under. Without the ability to organize, it felt like we were just cogs in a machine, disconnected from any sense of community or support.

The Need for Change: Addressing the Mental Health Toll and Workers’ Rights

Things began to change when the project I worked on was declared redundant in March 2023. Only then were we finally able to talk openly. With legal support, we started realizing we could organize and form associations to advocate for our rights, and we tried to create the African Content Moderators Union to represent workers like us. But forming a union has not been easy. Big tech companies, and the governments they work with, often claim that such unions might scare off investment or harm the reputation of the companies involved.

Additionally, older worker associations often struggle to understand the specific needs of platform workers. Their focus is usually on traditional forms of employment, and they don't always know how to handle the unique challenges tech workers face. It can be frustrating, because the world of platform work, especially in Africa, is new, and we are still figuring out how to navigate it. In Europe, the USA, and other parts of the world, content moderators are paid more than $20 an hour. In Kenya, we were paid $1.50 an hour for the same job on the same platform.

This is where cooperatives come in. Cooperatives offer a promising alternative for content moderators and other tech workers in Africa to organize and build solidarity. A cooperative allows workers to come together, build collective power, pool resources, and support one another. However, challenges remain. Many workers fear that joining a cooperative could jeopardize their jobs. The pressure to stay isolated and individualistic is strong, especially when you are working long hours in stressful conditions.

The Reality of Content Moderation Work

Content moderation is not an easy job. We work under tight deadlines and have to review staggering amounts of content. When you post something on Facebook or TikTok, we are the ones who review it; if it's flagged as inappropriate, we take it down to uphold the platform's community standards. We do this behind the scenes, often without recognition. Many people mistakenly think it's all done by AI, but it's actually real people, content moderators, working at high speed, sometimes reviewing up to a thousand pieces of content per day. Facebook's KPI required us to review and submit one piece of content every 50 seconds.

The content we review is often distressing. Imagine repeatedly watching suicide videos, violent crime scenes, terrorism propaganda, or graphic pornography—every day. That’s what content moderators are exposed to. It’s mentally and emotionally exhausting. Many of us develop mental health problems such as PTSD, anxiety, depression, and insomnia due to the nature of the work. I’ve personally struggled with panic attacks, flashbacks, and even suicidal thoughts. Even though I’m trained as a clinical psychologist, I found I couldn’t easily handle the toll this job took on my mental health.

I can never prepare for what I'll encounter on any given day, because I might watch 10, 15, or even 100 suicide videos before it ends. I dress up, go to work, and then might see 30 or 50 suicide videos that day. Doing that for a long time changes your perspective on life and how you see humanity. Platform work is necessary, but at what expense?

As we push for cooperatives and digitalization, we must also think about the harm this work can cause. What plans or solutions do we have for dealing with these harms? In the next 5 to 10 years, most informal work will likely become digitalized. How can we navigate the negative impacts on workers’ mental health and well-being while still moving forward?

The Invisible Trauma of the Tech Industry

The emotional impact of this work is so severe it’s often referred to as the “invisible trauma” of the tech industry. As workers, we face immense pressure, not just from the workload but from the psychological damage caused by repeatedly viewing harmful content. We often suffer in silence because many of us don’t even know how to talk about it or feel comfortable sharing our struggles with others who don’t understand the nature of the work.

Why Cooperatives?

Cooperatives are a way for workers to organize, build collective power, and create financial and emotional support systems. By forming a cooperative, we can:

1) Build financial capacity,
2) Support each other's mental health, and
3) Create a collective voice to push for better workplace policies.

However, many workers don’t yet understand their rights or fear losing their jobs if they speak out. Many also feel isolated and exhausted, leaving little energy to get involved in organizing. But platform work is here to stay and will continue growing in Africa. As more work becomes digitalized, we must find ways to protect ourselves and others from its harms.

I am particularly passionate about researching the mental health of content moderators and tech workers in Africa and finding ways to mitigate the psychological damage this work causes. Unfortunately, I had to drop out of my Ph.D. studies due to financial constraints and mental health struggles. But I am determined to continue this work, and I believe that through cooperatives we can begin to build solutions grounded in African realities that will help us navigate this difficult but necessary work.

As we explore the potential of cooperatives, we must also address the harmful impacts of platform work. We need to ensure that as we push for more digitalization and online work, we are not leaving workers behind or subjecting them to further harm. This is not just about building power and financial security but about protecting our health, human rights, dignity, and well-being.