Google defends efforts to prevent hate speech, extremism online
Company says it's using machine learning to find and remove YouTube videos that incite hate or violence
Google says it is working hard to eliminate hate speech from platforms like YouTube, using machine learning to catch problematic online content, sometimes before it has been seen by anyone.
However, Colin McKay, head of government affairs and public policy for Google Canada, said today the Canadian government should make it clearer to big tech companies what exactly constitutes unacceptable or hate speech online.
"I think the first step is to actually have a clear idea of what the boundaries are for terms like hate speech and violent extremist content," McKay told members of the justice committee studying online hate.
"Because we're still as a company interpreting and trying to define our perception of what society finds acceptable and what you as legislators and government find acceptable. The first step for us would rather be what is a clear definition, so we can act upon it.
"Because that is often where we have points of contention — what exactly is the expectation around take-down and restriction or limiting access on content, especially if it is related to hate as opposed to violent extremism."
McKay's comments come as lawmakers around the world grapple with the growing problem of hate speech and racism online. The justice committee has been holding hearings on the matter since mid-April.
The committee hearings began a month after a gunman opened fire in two mosques in Christchurch, New Zealand, in March, killing 51 people. The shooter managed to livestream the attack for several minutes before the feed was taken offline. Since the attack, that livestream video has continued to circulate online, despite efforts to have all copies of it taken down.
McKay said Google participated in last month's Christchurch Call to Action Summit — a gathering of countries, including Canada, to discuss ways to deal with terrorist radicalization and violent extremism online.
Testifying today at the justice committee's hearings on the spread of online hate speech, McKay said Google and YouTube removed 228 million comments that broke company guidelines in the first quarter of 2019.
"From January to March 2019, we removed over 8.2 million videos for violating YouTube's Community Guidelines," McKay told MPs. "Seventy-six per cent of those videos were first flagged by machines rather than humans. Of those detected by machines, 75 per cent had not received a single view.
"We have also cracked down on hateful and abusive comments, again by using smart detection technology and human reviewers to flag, review and remove hate speech and other abuse in comments."
McKay said Google's terms of service clearly ban incitement to violence and hate speech, but the company also wants to respect free speech.
"Our actions to address violent and hateful content, as is noted in the Christchurch Call, must be consistent with the principles of a free, open and secure internet, without compromising human rights and fundamental freedoms, including the freedom of expression," he said.
"We want to encourage the growth of vibrant communities, while identifying and addressing threats to our users and their societies."
McKay said Google has a team of 10,000 people around the world reviewing YouTube videos to determine whether they should remain online. He conceded that "very few" of the reviewers are based in Canada.
McKay said the amount of time it takes to remove content on YouTube varies, depending on "the context and the severity of the material."
"In the context of the Christchurch attack, we found that there were so many people uploading ... so quickly that we had to accelerate our artificial intelligence review of the videos and make on-the-fly decisions about taking down video based on it being substantially similar to previous uploads.
"In that process, that manual review was shortened extremely because we were facing a quantity."
Facebook is scheduled to appear before the justice committee when it resumes its hearings into online hate Thursday.
Elizabeth Thompson can be reached at elizabeth.thompson@cbc.ca