The Current

Facebook isn't doing enough to control violent posts, says expert

Violent acts being filmed and posted to social media have many asking whether it's possible to reverse the trend of violent video sharing.
The mother of 19-year-old Serena McKay is extremely disturbed by the existence of a video showing her daughter being beaten to death. Facebook CEO Mark Zuckerberg has said the company is working on preventing the sharing or streaming of videos of crime. (CBC)


Serena McKay was just 19 when she was killed in Sagkeeng First Nation in northern Manitoba.

And the crime was made all the more disturbing by a video posted on Facebook that appears to show McKay being violently assaulted.

Two teenage girls have been charged with second-degree murder in McKay's death.

Facebook has removed the video, though the company is still working to stop it from being shared on the Messenger app.

It's the latest disturbing video of real violence to appear on Facebook, either posted or streamed live.
And it is raising questions about how social media companies should handle such content.

Jacqueline Helfgott, a professor of criminal justice at Seattle University, says that for some people, social media can be a motivating factor for a crime.
Serena McKay, 19, was found dead April 23 in Sagkeeng First Nation. Two teenage girls have been arrested in connection with her death. (Submitted by family)

"Social media creates opportunities for people who aren't famous to be famous," Helfgott tells The Current's Friday host Laura Lynch.

"And there's an element of that in these types of crimes."

Facebook CEO Mark Zuckerberg has said his organization is working on preventing the sharing or streaming of videos of crime. But experts are asking how exactly that should be done.

I don't think Facebook is putting enough effort in ... and this is probably true of most of the tech companies. - Hany Farid

"Even with all of the advances in machine learning and AI, we are nowhere near being able to filter content like that at internet scale," says Hany Farid, a professor of computer science at Dartmouth College.

"There is a gap between what we can do and what we want to do."

However, Daphne Keller, the director of intermediary liability at the Stanford Law School Center for Internet and Society, questions whether machine monitoring is something we should even want to do.

"The idea that we can have an automated machine that can detect what's illegal from what's legal is pretty risky," Keller tells Lynch.
'I can never unsee it,' says elder Alma Kakinapinace, who found Serena McKay's body in Sagkeeng First Nation. (CBC)

Farid advocates bringing people into the decision-making process, working alongside machines to ensure that decisions about what content to take down are ethical. He says it's a complex process that would require an investment of time, staff and money.

"I don't think Facebook is putting enough effort in ... and this is probably true of most of the tech companies," he says.

"They put a huge amount of effort into data mining. They put a huge amount of effort into advertising. They put a huge amount of effort into the latest features, like Live. And I think the efforts that are going into really making sure these platforms are safe are much, much less," Farid points out.

Do we want a system where that private actor is systematically erring on the side of caution? - Daphne Keller

Keller, however, sees bigger ethical problems with the monitoring effort altogether.

"Platforms like Facebook are the public square," she says.

"Unlike the public square historically, they're run by a private company that is subject to media pressures and legal pressures and can choose to just silence difficult speech, or controversial speech, or minority speech if that is what makes the media and shareholders happier," Keller tells Lynch.

"Do we want a system where that private actor is systematically erring on the side of caution?"

Listen to the full segment at the top of this web post.

This segment was produced by The Current's Samira Mohyeddin, Catherine Kalbfleisch and Seher Asaf.