Meta Ends Sama Contract, Triggering Mass Layoffs of Over 1,100 Workers in Kenya Amid Ongoing Legal Battle

More than 1,100 workers in Kenya are set to lose their jobs after Meta ended a key outsourcing agreement with Nairobi-based firm Sama, marking a significant shift in the tech giant’s content moderation operations in Africa. The decision, which comes against the backdrop of an ongoing high-profile lawsuit filed by former moderators alleging exploitative working conditions, raises broader concerns about labor practices in the global digital outsourcing industry.

Sama confirmed it had received formal notice from Meta to terminate a “major engagement” at its Nairobi office, prompting the company to issue layoff notices affecting 1,108 employees. The roles impacted were largely tied to content moderation work previously carried out on behalf of Meta, the parent company of Facebook.

The layoffs represent one of the largest job cuts in Kenya’s outsourcing sector in recent years and underscore the volatility of third-party contracts that underpin much of the global tech ecosystem.

Sama had already begun transitioning away from content moderation services prior to the contract termination, pivoting toward AI data labeling and related digital services. However, the Meta contract remained a substantial component of its Nairobi operations, making the withdrawal particularly consequential.

The development is closely linked to an ongoing legal dispute that has drawn international scrutiny. Since 2022, Meta and Sama have been facing litigation initiated by former content moderators who allege that they were subjected to poor pay, inadequate mental health support, and excessive working hours.

In 2023, around 200 moderators escalated the dispute by filing a lawsuit seeking $1.6 billion in damages. The plaintiffs, many of whom were recruited from across Africa, claim they were exposed to deeply disturbing material including graphic violence and abuse without sufficient psychological support.

Their work involved reviewing posts, videos, and messages to enforce Meta’s community standards, a task that often required prolonged exposure to traumatic content.

Sama has defended its employment practices, stating that it paid wages significantly above Kenya’s minimum standards and provided access to mental health resources, including on-site support. Meta, for its part, has maintained that its contractors are required to meet or exceed industry standards for employee welfare in their respective markets.

Industry Impact

The termination of Meta’s contract with Sama is likely to reverberate across the global content moderation and outsourcing industry. For years, major tech companies have relied heavily on third-party vendors in lower-cost regions to manage the enormous volume of user-generated content on their platforms.

Kenya has emerged as a regional hub for such operations, offering a combination of skilled labor and cost efficiency. However, the Sama case highlights growing tensions between cost-driven outsourcing models and increasing demands for ethical labor practices.

Industry analysts suggest that tech firms may face mounting pressure to bring more moderation work in-house or to enforce stricter oversight of contractors. The legal and reputational risks associated with outsourcing sensitive tasks like content moderation are becoming harder to ignore.

At the same time, firms specializing in AI data services, the market toward which Sama has been pivoting, may find new opportunities as demand grows for high-quality labeled datasets to train machine learning systems.

Why This Matters

The layoffs and ongoing lawsuit raise critical questions about accountability in the digital economy. While companies like Meta set platform policies and standards, the enforcement of those rules often falls to outsourced workers who operate far from corporate headquarters.

This separation has created what critics describe as a “hidden workforce” responsible for maintaining online safety while bearing the psychological burden of the job.

The case in Kenya is one of the most prominent legal challenges to this model and could set important precedents for labor rights in the tech outsourcing sector. A ruling in favor of the plaintiffs could reshape how companies structure their global operations, particularly in relation to worker protections and compensation.

What Happens Next

The immediate future for the affected workers remains uncertain. Sama has stated that it is taking steps to support employees through the transition, though details of severance packages or redeployment opportunities have not been fully disclosed.

Meanwhile, the lawsuit continues to move through the courts, with potential implications not just for Sama and Meta, but for the broader tech industry.

If the plaintiffs succeed, the case could trigger a wave of similar claims in other regions where content moderation is outsourced. It may also accelerate regulatory efforts aimed at improving working conditions for digital laborers.

For Meta, the contract termination could signal a strategic recalibration of its moderation operations, possibly involving new vendors, different geographic locations, or increased automation through artificial intelligence.

Background Context

Content moderation has become an essential yet controversial function in the age of social media. Platforms like Facebook process billions of pieces of content daily, requiring large teams to identify and remove material that violates community guidelines.

Outsourcing this work to firms like Sama has allowed tech companies to scale operations efficiently, but it has also exposed workers to significant psychological risks.

The Nairobi hub was one of several global centers handling moderation tasks for Meta, particularly focused on content originating from Africa. Workers were tasked with reviewing some of the most extreme material on the internet, often under tight productivity targets.

The ongoing dispute has brought unprecedented attention to these workers’ experiences, sparking debates about corporate responsibility, labor rights, and the human cost of maintaining digital platforms.