Meta and Sama face legal action in Kenya for alleged poor work conditions
Meta and Sama, its main subcontractor for content moderation in Africa, will face a lawsuit in Kenya over allegedly unsafe and unfair working conditions unless they meet 12 demands on workplace conditions brought before them.
In a demand letter seen by TechCrunch, Nzili and Sumbi Advocates, the law firm representing former Sama employee Daniel Motaung, accused the subcontractor of violating various rights, including the health and privacy rights of Kenyan and international staff.
Motaung was allegedly laid off for organizing a 2019 strike over poor working conditions and pay and for trying to unionize Sama employees. The law firm has given Meta and Sama 21 days (starting Tuesday, March 29) to respond to the demands or face a lawsuit.
In the demand letter, the law firm asked Meta and Sama to adhere to the country’s labor, privacy and health laws; recruit qualified and experienced health professionals; and provide the moderators with adequate mental health insurance and better compensation.
“Facebook subcontracts most of this work to companies like Sama – a practice that keeps Facebook’s profit margins high but at the cost of thousands of moderators’ health – and the safety of Facebook worldwide. Sama moderators report ongoing violations, including conditions which are unsafe, degrading, and pose a risk of post-traumatic stress disorder (PTSD),” Motaung’s lawyers said.
The imminent suit follows a Time story that detailed how Sama recruited the moderators under the false pretext that they were taking up call center jobs. The content moderators, hired from across the continent, the story said, only learned about the nature of their jobs after signing employment contracts and relocating to its hub in Nairobi.
The moderators sift through posts on all of Meta’s platforms, including Facebook, to remove content that perpetrates and perpetuates hate, misinformation and violence.
Among the many requirements employees are expected to abide by is not disclosing the nature of their jobs to outsiders. The content moderators’ pay in Africa, the article said, is the lowest across the globe. Sama fashions itself as an ethical AI firm. The firm increased employee pay after the exposé.
The law firm alleged that Sama failed to grant Motaung and his colleagues adequate psychosocial support and mental health measures, including “unplanned breaks as needed particularly after exposure to graphic content.” The productivity of Sama’s employees was also tracked using Meta’s software, which measured employee screen time and movement during work hours. Sama granted them “thirty minutes a day with a wellness counselor.”
“Sama and Meta failed to prepare our client for the kind of job he was to do and its effects. The first video he remembers moderating was of a beheading. Up to that point, no psychological support had been offered to him in advance,” said the law firm.
Sama, in a post published after the exposé, denied any wrongdoing, stating that it is transparent during its hiring process and has a culture that “prioritizes employee health and wellness.”
“We understand that content moderation is a difficult but essential job to ensure the safety of the internet for everyone, and it’s why we invest heavily in training, personal development, and wellness programs,” said Sama.
“As a global technology company, partner and employer, we take pride in our responsibility to be transparent and honest. It is completely inaccurate to suggest that Sama employees were hired under false pretenses or were provided inaccurate information regarding content moderation work.”
Mercy Mutemi, who is leading the legal action, said, “I use Facebook, like many Kenyans, and it’s an important place to discuss the news. But that is why this case is so important.”
“The very safety and integrity of our democratic process in Kenya depends on a Facebook that is properly staffed and where content moderators, the front-line workers against hate and misinformation, have the support they need to protect us all. This isn’t an ordinary labor case – the working conditions for Facebook moderators affect all Kenyans.”
A Meta spokesperson, in a response to questions sent by TechCrunch, said, “We take our responsibility to the people who review content for Meta seriously and require our partners to provide industry-leading pay, benefits and support. We also encourage content reviewers to raise issues when they become aware of them and regularly conduct independent audits to ensure our partners are meeting the high standards we expect of them.”
This article has been updated to include Meta’s response.