The Dark World of Social Media Moderators

(Nairobi) – Social media platforms rely on thousands of unseen moderators who spend hours reviewing disturbing and often illegal content to protect online users. This army of moderators, based worldwide and often employed through third-party firms, encounters distressing material daily, including violent videos, abuse, and hate speech. As social media continues to expand, companies face growing pressure to remove harmful content rapidly. Yet, despite advances in AI technology, human moderators remain crucial to filtering online content, and the toll on their mental health is profound.

In recent months, BBC journalists have investigated this unseen side of social media in a series titled The Moderators on BBC Radio 4 and BBC Sounds. They explored the world of social media moderation, speaking to former moderators across East Africa, many of whom left the industry because of trauma experienced in their work. These moderators described reviewing hundreds of shocking and painful videos, including scenes of violence, abuse, and other disturbing imagery.

Many moderators, including a former TikTok moderator in Nairobi named Mojez, revealed the mental health impacts of their roles. Mojez spoke about viewing hundreds of “horrific and traumatizing” videos. “I took it upon myself,” he said, explaining that he sacrificed his mental well-being so that general users could enjoy a safe online experience. Stories like his underscore the heavy psychological burden moderators carry, and some have since united to advocate for mental health resources and legal protections for their field.

These workers, seen as a “first line of defense” on platforms like Facebook, Instagram, and TikTok, often take pride in their protective role. Some moderators likened themselves to emergency responders, with one describing the job as being as vital as a paramedic's. For one moderator, whom the BBC referred to as David, the work was particularly meaningful because it contributed to training AI systems: he took pride in helping train models to recognize harmful content for tools such as ChatGPT.

As technology evolves, AI tools are taking on a greater share of the moderation burden. For example, OpenAI, the developer of ChatGPT, says its AI moderation tools can detect harmful content with up to 90% accuracy. Dave Willner, former head of trust and safety at OpenAI, reflected on how AI moderation could complement human efforts, noting that AI “doesn’t get bored, tired, or shocked.”

Despite these improvements, experts like Dr. Paul Reilly, a senior lecturer in media and democracy at the University of Glasgow, caution against relying entirely on AI. Reilly believes that AI can be overly restrictive, possibly suppressing free speech, and may lack the nuanced understanding that human moderators bring to the role. “Human moderation is essential to platforms,” he said, though he acknowledged the severe impact on those who do the work.

Efforts to support moderator well-being vary by platform. TikTok, Meta (parent company of Facebook and Instagram), and OpenAI have each stated that they prioritize the health of their workers. TikTok has implemented clinical support and wellness programs, while Meta mandates that third-party companies provide 24-hour on-site mental health support. OpenAI expressed gratitude to its moderators and reported following strict policies for their protection.

The toll on moderators, however, continues to draw global attention. In 2020, Meta (then known as Facebook) agreed to a $52 million settlement with a group of U.S.-based moderators who had developed mental health conditions from the distressing nature of their work. The legal action, initiated by former moderator Selena Scola, highlighted the psychological challenges moderators face. Scola described moderators as “keepers of souls,” a reference to the often heartbreaking content they review.

The stories shared in The Moderators series offer a glimpse into the daily sacrifices these individuals make to keep online spaces safe. Many former moderators still grapple with trauma, finding it difficult to sleep, eat, or interact socially. Some East African moderators have formed a union to press for safer working conditions and better mental health resources.

For the full investigation, listen to The Moderators on BBC Radio 4 from November 11 to 15 or on BBC Sounds.