Fixing one of tech’s most inhumane jobs is not an enviable task.
Moderators have to dig through some of the darkest content humanity has to offer — and the work is often traumatizing. But AI moderation isn’t quite up to snuff either, which means humans are an unfortunate necessity.
In a new report, however, NYU professor Paul M. Barrett has a few ideas for revamping the industry — starting with one big one. Instead of outsourcing content moderation, he argues, companies like Facebook need to bring moderators on as full-time employees.
How content moderation went so wrong
Big tech companies currently “marginalize the people who do content moderation” to give themselves “plausible deniability” over content failures, according to the report.
Facebook’s content moderators are often subcontractors. Many worked for a company called Cognizant, earning salaries as low as ~$28k with few health benefits. (Last October, Cognizant announced it was exiting the business.)
After a string of moderators received PTSD diagnoses, they sued Facebook — and last month, the company agreed to shell out $52m to 11,250 of its moderators.
But moderators say they are blocked from “voicing concerns and contributing to the public discussion.” A group of moderators on Monday expressed solidarity with virtual walkouts at Facebook, writing, “We would walk out with you — if Facebook would allow it.”
The new moderation action plan
Here are a few more of Barrett’s prescriptions:
- Double the number of human moderators. With more staff, moderators could rotate shifts more often and view less traumatizing content overall.
- Put moderation teams in every country. Moderators fluent in local languages and politics are best equipped to sniff out inappropriate posts.
- Give moderators more mental health support. Trauma counseling is urgently needed — but most moderators struggle to access it.