Facebook Sued by Former Kenyan Content Moderator

A person who says he was "destroyed" by his work as a content moderator for Facebook has filed a lawsuit accusing the company of human trafficking Africans to work in an exploitative and unsafe facility in Kenya.

The case against Meta Platforms, the Menlo Park, Calif. company that owns Facebook, and Sama, a San Francisco subcontractor, was filed Tuesday in a court in the Kenyan capital, Nairobi.

Daniel Motaung's petition "calls upon Kenya's courts to order Facebook and its outsourcing companies to end exploitation in its Nairobi moderation hub, where content moderators work in dangerous conditions," said a statement by Foxglove, a London-based legal nonprofit that supports Facebook content moderators.

The first video Motaung watched as a Facebook moderator showed someone being beheaded, he told reporters during a call Tuesday. He stayed on the job for roughly six months after relocating from South Africa to Nairobi in 2019 for the work. Motaung says he was dismissed after trying to spearhead efforts to unionise at the facility.

Motaung said his job was traumatising and that he now has a fear of death.

"I had potential," Motaung said. "When I went to Kenya, I went to Kenya because I wanted to change my life. I wanted to change the life of my family. I came out a different person, a person who has been destroyed."

Motaung says in his filing that when he arrived in Kenya for the work, he was told to sign a non-disclosure agreement, and that his pay was lower than promised, with one monthly paycheck amounting to KES 40,000, or roughly $350 (roughly Rs. 27,000).

The lawsuit alleges that Sama targets people from poor households across Kenya, South Africa, Ethiopia, Somalia, Uganda and other countries in the region with "misleading job ads" that fail to disclose that they will be working as Facebook content moderators or viewing disturbing material that exposes them to mental health problems.

Applicants are recruited "through deceit," said Mercy Mutemi, who filed the petition in court Tuesday morning. "We found a number of Africans were forced into forced-labour situations and human trafficking. When you leave your country for a job that you did not apply for, that amounts to human trafficking."

Content moderators are not given enough medical coverage to seek mental health treatment, the filing alleges.

The lawsuit also seeks orders for Facebook and Sama to respect moderators’ right to unionise.

Meta’s office in Nairobi said it takes seriously its responsibility to people who review content for the company and requires its “partners to provide industry-leading pay, benefits and support,” according to a statement issued by the company’s spokeswoman.

"We also encourage content reviewers to raise issues when they become aware of them and regularly conduct independent audits to ensure our partners are meeting the high standards we expect of them," the statement said.

In 2020, Facebook agreed to pay $52 million (roughly Rs. 401 crore) to US content moderators who filed a class-action lawsuit after they were repeatedly exposed to beheadings, child and sexual abuse, animal cruelty, terrorism and other disturbing content.

Sama, which describes itself as an ethical AI company, did not immediately provide comment.

Sama's Nairobi location is the largest content moderation facility in Africa, with roughly 240 employees working on the effort, according to the filing.

"We are not animals," Motaung said in the statement. "We are people, and we deserve to be treated as such."
