Back in 2007, a settlement with the state of New York required Facebook to remove pornographic posts within 24 hours.
At first, Facebook tasked its own employees with the job — but they were quickly overwhelmed, and the company turned to automation and outsourcing.
Today, 90%+ of flagged content is removed by AI…
… while the other 10% falls on human moderators.
Accenture took on the work when Facebook moved on from its original content moderation partner.
The scope of the agreement expanded quickly:
- The moderation team grew from 300 to 3k workers between 2015 and 2016, and is now at ~5.8k workers
- Accenture now employs moderators across 8 offices spanning the Philippines, India, Malaysia, Poland, Ireland, and the US
Facebook has several reasons to farm out moderation work…
Among them:
- Scalability: Using contract labor allows Facebook to scale its efforts up or down globally as quickly as needed
- Cost: Contract workers are far cheaper to hire than Facebook employees (who have median annual earnings of $240k)
Perhaps the biggest reason, though, is the nature of the work.
- Moderators can view up to 700 posts per shift, and the content can be disturbing.
- An exposé by The Verge in 2019 highlighted examples of the toxic content moderators are subjected to, including videos of killings and animal cruelty.
As a result, many moderators have suffered mental health issues from the work. Facebook recently paid a $52m settlement covering 11k+ moderators who developed PTSD on the job.
So why is Accenture doing this?
One reason is money. Digital content moderation is expected to be an $8.8B industry by next year, and Accenture is currently pulling in $500m a year from Facebook alone.
The other is exposure to Silicon Valley, which Accenture lacked before taking on the work. Time will tell whether the firm continues to believe that shouldering Facebook’s dirty work is worth it.