
An NYU report offers a plan for the content moderation industry

The job is a nightmare. But tech companies might be able to make it a little more humane.


June 9, 2020

Fixing one of tech’s most inhumane jobs is not an enviable task.

Moderators have to dig through some of the darkest content humanity has to offer — and the work is often traumatizing. But AI moderation isn’t quite up to snuff either, which means humans are an unfortunate necessity.

In a new report, though, NYU professor Paul M. Barrett offers a few ideas for revamping the industry, starting with one big one: instead of outsourcing content moderators, he argues, companies like Facebook need to bring them on as full-time employees.

How content moderation went so wrong

Big tech companies currently “marginalize the people who do content moderation” to give themselves “plausible deniability” over content failures, according to the report.

Facebook’s content moderators are often subcontractors. Many worked for a company called Cognizant, earning salaries as low as ~$28k and receiving few health benefits. (Last October, Cognizant left the business.)

After a string of moderators received PTSD diagnoses, they sued Facebook — and last month, the company agreed to shell out $52m to 11,250 of its moderators.

But moderators say they are blocked from “voicing concerns and contributing to the public discussion.” A group of moderators on Monday expressed solidarity with virtual walkouts at Facebook, writing, “We would walk out with you — if Facebook would allow it.”

The new moderation action plan

Here are a few more of Barrett's prescriptions:

- Double the number of moderators to improve the quality of content review
- Provide everyone in the job with on-site medical care, including access to psychiatrists
- Expand moderation efforts in at-risk countries in Asia and Africa
- Sponsor research into the health risks of content moderation, including PTSD

