Kenyan Facebook moderators say their work is ‘not for humans’

Trevin Brownie hasn’t forgotten his first day as a content moderator for Facebook, on the premises of a subcontracted company in Nairobi, the Kenyan capital.

“My first video was of a man committing suicide. […] There was a 2- or 3-year-old child playing next door. After the man hanged himself, after about two minutes, the child understood that something was wrong,” says the 30-year-old South African, before describing the boy’s reaction.

“This made me sick,” he says, noting that the images caused him “nausea, vomiting”. “But I continued to do my job,” he adds.

Between 2020 and 2023, he watched hundreds of violent and hate-filled videos every day in order to block them and keep them from reaching the eyes of Facebook users.

He worked in Nairobi for Sama, a Californian company that Meta – the parent company of Facebook, Instagram and WhatsApp – contracted between 2019 and 2023 to moderate Facebook content in sub-Saharan Africa.

Up to 260 moderators from various African countries passed through this operations center, hired above all for their knowledge of numerous local languages.

Trevin Brownie says he saw “hundreds of beheadings”, “organs torn from bodies”, “rape and child pornography at the highest level”, “child soldiers preparing for war”…

“Humans do things to other humans that I would never have imagined,” he says. “People have no idea about the morbid videos […] that they are spared,” he adds.

Court battle

Trevin Brownie is a plaintiff in one of three cases brought against Meta and Sama in Kenya.

He is contesting his dismissal along with 183 other former employees of Sama, which has announced that it is winding down its content moderation activity. The plaintiffs are seeking compensation for salaries they call “insufficient” given “the risk they were exposed to”, and for the “damage caused to their mental health”.

The judicial offensive began when another former worker, Daniel Motaung, filed a lawsuit in May 2022 before a Nairobi court denouncing “degrading” working conditions, deceptive recruitment methods, insufficient pay and a lack of psychological support.

Testimonies collected at the end of April from the other plaintiffs corroborate the facts Motaung denounced.

Two of them, Amin and Tigist (their names have been changed), hired in 2019 in Sama’s first group of moderators, said they had responded to job offers for call-center positions.

It wasn’t until they signed their contracts, which included confidentiality clauses, that they discovered they would actually be working as content moderators.

Amin and Tigist said nothing, nor did they think of leaving. “I had no idea what a content moderator was; I had never heard of it,” says Tigist, an Ethiopian who got the job because she speaks Amharic.

“Most of us didn’t know the difference between a call center and a content moderation center,” confirms Amin, who worked on the Somali “market”.

“During the training, they downplayed the content. What they showed us was nothing compared to what we would end up seeing,” he adds. “The problems started later,” he says.

Trauma

On their screens, eight hours a day, content scrolled past, each item more disturbing than the last.

“You don’t choose what you see, it comes at random: videos of suicides, violence, sexual exploitation of children, nudity, incitement to violence…,” recounts Amin.

They had to spend an average of 55 to 65 seconds on each video and analyze between 378 and 458 posts a day, at the risk of being reprimanded or fired if they worked too slowly, they explain.

Sama said it was not “in a position” to comment on ongoing cases.

Meta said that moderators “in principle do not have to evaluate a set number of posts, have no quotas and are not forced to make hasty decisions”.

“We allow and encourage the companies we work with to give their employees the time they need to make a decision,” the company added.

None of the three content moderators had imagined the consequences that this job would end up having on their lives.

They have not consulted a psychologist or psychiatrist, for lack of money, but all say they have symptoms of post-traumatic stress disorder and difficulty relating to family and society.

Trevin Brownie says he is now “afraid of children because of the child soldiers” and of crowded places “because of all the videos of attacks” he has seen.

“Parties used to drive me crazy,” he recounts. “I haven’t been to a club in three years. I can’t, I’m scared,” he maintains.

In Amin’s case, the main effects show in his body, which went from 96 kilos to “69 or 70 kg”.

They all say they have become numb to death and terror. “My heart has turned to stone,” summarizes Tigist.

“Money was needed”

Meta said it has “clear contracts” with all its partners that include “the availability of individual counseling and additional support for those who are exposed to more difficult content”.

“We require all the companies we work with to provide assistance 24 hours a day, 7 days a week, with trained professionals, an on-call service and access to private healthcare from the first day of the contract,” the company stressed.

According to the plaintiffs, the support Sama offered through its “wellness counselors” did not measure up, amounting to vague interviews with no real follow-up. They also questioned the confidentiality of the sessions.

“It wasn’t useful at all. I’m not saying they weren’t qualified, but I think they weren’t qualified enough to handle people who moderate content,” Amin considers.

Despite their traumas, they kept working because they “needed the money”.

With a salary of 40,000 shillings (270 euros, $287) and an additional 20,000 shillings for non-Kenyans, they earned almost triple the national minimum wage (15,200 shillings).

“Since 2019, I have not managed to find another job, despite applying to many. I had no choice. That’s why I stayed so long,” Amin explains.

“Sacrifice”

To hold out, moderators have to find “defense mechanisms”, explains Trevin Brownie.

Some turn to drugs, such as cannabis, interviewees say.

The South African, who used to love comedies, turned to horror films. “It was a way of escaping from my reality,” he says, explaining that he also developed an “addiction” to violent images.

“But one of our main defense mechanisms is convincing ourselves of the importance of this work,” he adds.

“I had the impression that it was hurting me, but for a good reason, […] that the sacrifice was worth it for the good of society.”

“Without us, social networks cannot exist,” he adds. “No one would open Facebook if it were full of shocking content, drug sales, blackmail, harassment…”

“We deserve to be treated better”

“This work causes damage, and we sacrifice ourselves for our community, for the world. We deserve to be treated better,” Tigist points out.

None of them would do this work again.

“My view is that no human should do this. It’s not a job for humans,” explains Trevin Brownie. “Frankly, I wish artificial intelligence could do this job.”

But despite the enormous strides the technology has made, Trevin Brownie doubts that will be possible in the near future.

“Technology plays and will continue to play a central role in our content review operations,” Meta said.

Until now, none of them had told anyone about their work, not even their families, because of the confidentiality clauses and because “no one can understand what we live through”.

“If people find out, for example, that I have seen pornography, they will judge me,” explains Tigist.

She told her husband little about what she did. From her children, she hid everything: “I don’t want them to know what I’ve done. I don’t even want them to imagine what I’ve come to see,” she says.

Source: AFP

