The Current

Efforts to block hate speech on Facebook actually work to discriminate against minorities, critics say

As Facebook looks to expand its fleet of moderators, critics say the current system promotes biased decision-making against marginalized people.

People who discuss racist abuse online are often punished for repeating the language used against them

When it comes to moderating content on social media sites, who decides? And what are their biases? (Justin Sullivan/Getty Images)

Facebook has announced it will double its fleet of newsfeed integrity data specialists — the moderators who remove hate speech and violent content — from 10,000 to 20,000 people by the end of 2018. But critics say the current system promotes biased decision-making against racially and politically marginalized people.

To discuss the pervasive problem of digital bias, and how we can combat it, The Current's guest host Gillian Findlay spoke to:

  • Daphne Keller, the intermediary liability director at the Center for Internet and Society at Stanford Law School.
  • Elizabeth Dubois, an associate professor of communications at the University of Ottawa and a fellow at the Public Policy Forum.

The Current requested a statement from Facebook responding to some of the concerns raised by our guests, but we have not heard back.


This segment was produced by The Current's Pacinthe Mattar.