
Facebook is paying Accenture $500m a year to moderate content on its platforms

A recent report revealed Accenture to be Facebook’s biggest content moderation partner, pulling in $500m a year for the work.


Back in 2007, a settlement with the state of New York required Facebook to remove pornographic posts within 24 hours.

At first, Facebook tasked its own employees with the job — but they were quickly overwhelmed and the company turned to automation and outsourcing.

Today, 90%+ of flagged content is removed by AI…

… while the other 10% falls on human moderators.

Accenture took on the work when Facebook moved on from its original content moderation partner.

The scope of the agreement escalated quickly.

Facebook has several reasons to farm out moderation work…

Among them:

  1. Scalability: Utilizing contract labor allows Facebook to scale its efforts up or down globally as quickly as needed
  2. Cost: Contract workers are far cheaper to hire than Facebook employees (who have median annual earnings of $240k)

Perhaps the biggest reason, though, is the nature of the work: moderators spend their days sifting through the platform's most graphic and disturbing posts.

As a result, many moderators have suffered mental health issues. Facebook recently paid a $52m settlement covering 11k+ moderators who developed PTSD on the job.

So why is Accenture doing this?

One reason is money. Digital content moderation is expected to be an $8.8B industry by next year, and Accenture is currently pulling in $500m a year from Facebook alone.

The other is exposure to Silicon Valley, which Accenture lacked before taking on the work. Time will tell whether the firm continues to believe that the weight of Facebook's dirty work is worth it.
