Facebook moderators demand Covid-19 safety protections

Contract workers of the social media giant petition Facebook to "keep moderators and their families safe" by maintaining remote work as much as possible and offering "hazard pay" to those who come into the office.

A 3D-printed Facebook logo is seen on a keyboard in this illustration from March 25, 2020.
Reuters

More than 200 Facebook content moderators have demanded better health and safety protections as the social media giant called the workers back to the office during the pandemic.

A petition signed by the contract workers living in various countries said Facebook should guarantee better conditions or allow the workers to continue their jobs from home.

"After months of allowing content moderators to work from home, faced with intense pressure to keep Facebook free of hate and disinformation, you have forced us back to the office," said the open letter released by the British-based legal activist firm Foxglove.

READ MORE: Big Tech expects solid earnings despite politics


Keep them safe

The letter called on Facebook to "keep moderators and their families safe" by maintaining remote work as much as possible and offering "hazard pay" to those who do come into the office.

When the pandemic hit, Facebook sent home most of its content moderators, the workers responsible for filtering violent and hateful images as well as other content that violates platform rules.

But the social media platform found limits on what remote employees could do and turned to automated systems powered by artificial intelligence, which had shortcomings of their own.

"We appreciate the valuable work content reviewers do and we prioritize their health and safety," a Facebook spokesperson said in a statement to AFP.

"The majority of these 15,000 global content reviewers have been working from home and will continue to do so for the duration of the pandemic," the spokesperson said.

Human moderators

The workers' letter said the current environment highlights the need for human moderators.

"The AI wasn't up to the job. Important speech got swept into the maw of the Facebook filter, and risky content, like self-harm, stayed up," the letter said.

"The lesson is clear. Facebook's algorithms are years away from achieving the necessary level of sophistication to moderate content automatically. They may never get there."

The petition said Facebook should consider making the moderators full employees, who in most cases may continue working remotely through mid-2021.

"By outsourcing our jobs, Facebook implies that the 35,000 of us who work in moderation are somehow peripheral to social media," the letter said, referring to a broader group of moderators that includes the 15,000 content reviewers.

"Yet we are so integral to Facebook's viability that we must risk our lives to come into work."

READ MORE: Facebook seeks to block NWU's ad-targeting data tool
