Nathan Nkuzimana worked for a Kenyan company called Sama until early 2023.
Sama is a technology company that sold content moderation services to Facebook. Content moderators look at material posted to social media websites such as Facebook and decide whether it should stay up or be removed.
The 33-year-old Nkuzimana came to Kenya from Burundi to work for Sama. He said he looked at violent and sexual images and videos every workday.
He said he saw a child being sexually violated in one post and a woman being killed in another. Nkuzimana said seeing such horrors damaged his mental health.
Nkuzimana is not alone. He joined about 200 workers from Kenya who are part of a larger group in Africa suing Sama and Facebook. They say they suffered through harmful working conditions.
It is the first known international legal case related to Facebook content moderators. In 2020, Facebook settled a legal action brought against it by content moderators in the U.S.
The Sama workers did their jobs at a Nairobi office. They looked at material that came from Facebook users in Africa. Their job was to remove harmful or illegal posts.
The workers in Africa are seeking $1.6 billion in compensation for their work. They say their employers did not pay them enough or provide enough mental health support. They also say Sama and Facebook should continue to pay them while courts consider the case.
Facebook and Sama both deny the workers’ accusations.
A Kenyan court is hearing the case.
Many of the workers came to Kenya from other countries because Sama paid well. Workers from Kenya earned $429 per month, and workers from other countries, including Nkuzimana, made a little more.
People who follow technology news say the group in Kenya is the most visible because they are also pushing to form a workers’ group, or union.
While the case is being decided, however, the workers are not being paid. Their work permits also have time limits.
Away from home
Many of the workers left their home countries because they heard about good pay. But they also wanted to leave because of conflicts at home. Fasica Gebrekidan is from Ethiopia. She left home for Kenya because she did not want to get caught up in her country’s civil war in the northern Tigray region.
She knew of the bad things happening in her country. When she started working for Sama, she said she saw frightening images and videos. In order to make a decision about a video, she would have to watch the first 50 seconds and the last 50 seconds. She would see images of war and rape.
“You run away from the war, then you have to see the war,” Fasica said. “It was just a torture for us.”
Many of the moderators said they started the work at Sama with good feelings. Nkuzimana said he and his co-workers felt like “heroes to the community.” He went on to say that people feel safe looking at Facebook because of workers like him. He compared the workers to soldiers who might be hurt so everyone else can be safe.
But those good feelings turned bad after hours watching harmful material. Nkuzimana said he would come home and close himself in his room so he would not have to speak with his family about what he saw that day.
The workers said the U.S.-based Sama did not help the moderators work through what they saw. The company, however, said mental health professionals were available to all employees.
Sarah Roberts is an expert in content moderation at the University of California, Los Angeles. She said workers might risk their mental health for a chance to work in technology and make good money.
When companies like Sama are hired to do work for Facebook, Roberts explained, it permits Facebook to say the workers are not its employees. In addition, she said, the workers are telling “the story of an exploitative industry.”
Forever in their heads
Fasica said she is worried she will never be able to have a normal life. She always sees the images in her head. She called the material “garbage” and worries it will stay in her head forever.
She said Facebook should know what is going on with the workers. “They should care about us,” she said.
I’m Dan Friedell. And I’m Caty Weaver.
Dan Friedell adapted this story for Learning English based on a story by The Associated Press.
_____________________________________________________________________
Words in This Story
sue –v. to start a legal action because you think you have been hurt by someone else
compensation –n. payment given to someone for work done or to make up for loss or harm
visible –adj. well-known
______________________________________________________________________
We want to hear from you. Did you know so many people posted offensive material on Facebook?