YouTube moves to ban neo-Nazi, Holocaust-denying videos

Will also curb spread of videos promoting misinformation, such as phoney miracle cures

YouTube announced today it is intensifying efforts to remove racist material, hate speech and videos denying that historical events like the Holocaust took place. (Dado Ruvic/Reuters)

YouTube says it is cracking down on hate speech on its platform, banning videos that promote neo-Nazi ideology or claim that events like the Holocaust never took place.

The move is expected to result in thousands of videos and channels being removed from the popular online platform.

But while the company will begin enforcing its new policy today, it admits it could take months before all the content is removed.

"The openness of YouTube's platform has helped creativity and access to information thrive," the company said in a statement posted Wednesday. "It's our responsibility to protect that, and prevent our platform from being used to incite hatred, harassment, discrimination and violence."

YouTube's announcement comes as online platforms face intensifying political criticism for allowing such material to be posted, particularly in the wake of the Christchurch mosque shootings in New Zealand in March, in which 51 people died and dozens more were injured.

The gunman livestreamed the attack online for several minutes. Tech companies have struggled since then to remove copies of the video posted to their platforms — often with small changes made to thwart their AI-driven monitoring systems.

An elderly man touches the wall bearing the names of victims at Budapest's Holocaust Memorial Centre in Hungary. (Laszlo Balogh/Reuters)

YouTube and other online giants also have been accused of employing algorithms that promote the viewing of extreme content because it generates more traffic.

The House of Commons justice committee has been holding hearings on the spread of online hate and extremism.

In its post, YouTube says it moved in 2017 to limit discriminatory content, "including limiting recommendations and features like comments and the ability to see the video. This step dramatically reduced views to these videos (on average 80 per cent)."

Now, it's going further.

"Today, we're taking another step in our hate speech policy by specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status," it wrote.

"This would include, for example, videos that promote or glorify Nazi ideology, which is inherently discriminatory.

"Finally, we will remove content denying that well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary, took place."

The lawn outside the U.S. Capitol is covered with 7,000 pairs of empty shoes to memorialize the children killed by gun violence since the Sandy Hook school shooting, in a display organized by the global advocacy group Avaaz in Washington, DC, March 13, 2018. (Saul Loeb/AFP/Getty Images)

However, the company doesn't plan to eliminate the videos entirely.

"We recognize that some of this content has value to researchers and NGOs looking to understand hate in order to combat it and we are exploring options to make it available to them in the future," said the company's statement. "And, as always, context matters, so some videos could remain up, because they discuss topics like pending legislation, aim to condemn or expose hate, or provide analysis of current events."

YouTube is also moving to prevent channels that repeatedly come close to the line on hate speech from making money through ads on their videos.

"Channels that repeatedly brush up against our hate speech policies will be suspended from the YouTube Partner program, meaning they can't run ads on their channel or use other monetization features like SuperChat," it wrote.

The company also will limit the distribution of "harmful misinformation," expanding to more countries by the end of 2019 an initiative piloted in the U.S. in January that has reduced the spread of such videos by more than 50 per cent. The company says the pilot project limits the spread of things like "videos promoting a phony miracle cure for a serious illness, or claiming the Earth is flat.

"Our systems are also getting smarter about what types of videos should get this treatment and we'll be able to apply it to even more borderline videos moving forward."

At the same time, the company said, it will promote authoritative content by suggesting it in the "watch next" panel.

Bernie Farber, chairman of the Canadian Anti-Hate Network, welcomed YouTube's announcement but said he's waiting to see how vigorously it follows up with action.

He said he's not surprised that YouTube is warning it could take a while to remove the offending videos.

"I can't even calculate the number of YouTube videos that are out there that are white-nationalist-based, Nazi-based," he said. "It's going to take a long time, but you've got to start somewhere."

Farber said Facebook has taken down a few accounts, but then appeared to stop, while Twitter has made no attempt to prevent its platform from being used for hate speech.

He said public opinion in the wake of the shootings in Quebec, Pittsburgh and Christchurch has helped to convince big tech platforms to address the problem.

"I think the public pressure has become unbearable."

Former Ontario Progressive Conservative candidate Andrew Lawton of the True North Centre — which describes itself as "an independent, non-profit research and educational organization dedicated to advancing sound immigration and security policies" — said private companies like YouTube have the right to decide who gets banned from their platforms, but they also have a lot of power.

"The desire to purge racist or hateful content from your platform is a noble one but the questions will always come back to who defines it and how it is defined," he wrote in an e-mail.

"This is especially concerning with a parliamentary committee weighing whether to regulate social media companies that allow what it's classing as hate speech. I fear the impact will be to excessively censor, to avoid the risk of running afoul of whatever the government decides hate speech is, if this goes ahead.

"At a certain point, these decisions by YouTube and other platforms to censor seem to appear more like editorial directives rather than being exclusively about cleaning up bona fide hate speech."

ABOUT THE AUTHOR

Elizabeth Thompson

Senior reporter

Award-winning reporter Elizabeth Thompson covers Parliament Hill. A veteran of the Montreal Gazette, Sun Media and iPolitics, she currently works with the CBC's Ottawa bureau, specializing in investigative reporting and data journalism. In October 2024 she was named a member of the International Consortium of Investigative Journalists. She can be reached at: elizabeth.thompson@cbc.ca.