
Anti-hate group calls for regulator to police social media platforms

The federal government should appoint a regulator with the power to force social media companies to disclose information to help fight far-right extremism, an anti-hate group told MPs Tuesday.

Facebook and Twitter monitored Ottawa convoy protest's social media posts

Police hold back far-right protesters during a demonstration in Montreal on Saturday, March 4, 2017. (Graham Hughes/The Canadian Press)


Evan Balgord, executive director of the Canadian Anti-Hate Network, said an ombudsperson could pressure tech companies to do more to reduce online harms.

"The basic idea is that you have an ombudsperson, a regulator, a well-resourced one, with investigatory powers so they can kick down the door of Facebook and take their hard drives," Balgord told members of the Commons public safety and national security committee studying "ideologically motivated violent extremism."

"I'm being a bit hyperbolic here but we know that these platforms hide data from us and lie to journalists, so we do need broad investigatory powers to investigate them."

Balgord said the regulator should be empowered to issue recommendations about the algorithms social media platforms use to engage with their audiences, and to take cases to court. He said platforms should face the threat of fines if they refuse to follow the regulator's recommendations.

Balgord was one of three experts who testified before the committee on Tuesday. All three described the rise of far-right extremism in Canada, enabled by social media.

Balgord drew a direct line from anti-Muslim groups through the Yellow Vest Canada protests to the convoy protest that paralyzed downtown Ottawa for three weeks and blocked border crossings. He pointed to the Jan. 6, 2021 mob assault on the Capitol Building in Washington, D.C. as an example of where such movements can lead.

Members of the audience wear yellow vests and one person wears a jacket with the logo of La Meute, a far-right group, as Prime Minister Justin Trudeau participates in a town hall Q&A in Saint-Hyacinthe, Quebec on January 18, 2019. (Ryan Remiorz/The Canadian Press)

"They're not all racist, they're not all violent," said Balgord. "Not all people on January 6 were either. There were groups in those midsts that decided that they were going to try to do a coup and they swept up a lot of the other people there.

"The same thing is kind of happening here. We have more extreme elements of our far-right movement than others, but as a whole, they are becoming a threat to our democracy,"

Barbara Perry, director of Ontario Tech University's Centre on Hate, Bias and Extremism, said the convoy protest showed "the risks and threats associated with the right-wing movement in Canada."

Perry said the convoy protest demonstrated a capacity to organize on a large scale through encrypted and unencrypted social media platforms.

Police officers push back protesters in front of the Senate of Canada building on Friday, Feb. 18, 2022. (Evan Mitsui/CBC)

"That was the venue through which they were able to display this adeptness that they really have in terms of their ability to exploit the broader popular concerns, grievances, anxieties, and weave them into their own narratives," she said.

Perry called for better law enforcement intelligence, saying police failed to properly evaluate the nature of the convoy protest. She also pointed out that some officers donated to the convoy or shared online conspiracy theories and misinformation.

Wendy Via, co-founder of the U.S.-based Global Project Against Hate and Extremism, told MPs that social media platforms are major drivers of hate speech and conspiracy theories and called on the government to hold them to account.

"The United States, Canada and many other countries are currently awash in hate speech and conspiracy theories like QAnon, anti-vax, election disinformation and the Great Replacement, spreading on poorly moderated social media," she said.

Former U.S. president Donald Trump speaks at a rally at the Canyon Moon Ranch festival grounds on January 15, 2022 in Florence, Arizona. (Mario Tama/Getty Images)

Via said American militia groups have established themselves on both sides of the border and people like former U.S. president Donald Trump have "legitimized hate and other extremist ideas."

"Research shows that Trump's campaign and politics galvanized Canadian white supremacist ideologies and movements and his endorsement of the trucker convoy, along with media personalities like Tucker Carlson, undoubtedly contributed to the influx of American donations to the trucker siege," she said.

Representatives of Facebook's owner Meta, meanwhile, told the committee that the company monitored groups and accounts related to the truck convoy 24/7 once it began and did not see hate speech or violent content associated with the protest.

"We did not see dangerous organizations, a significant amount of dangerous organizations and individual involvement in the convoy blockade and protest in Canada," said David Tessler, public policy manager for Meta.

Rachel Curran, public policy manager for Meta Canada, said some content that violated Facebook's community standards was removed, but that Facebook users are allowed to criticize the government online.

"Expressing opposition to government mandates is not against our community standards and so we allow that on our platforms," she said.

Michele Austin, Twitter's director of public policy for Canada and the U.S., said her company also monitored the truck convoy protest.

"We knew when it was arriving in Ottawa, we knew when it was taking place in Alberta and we exercised and enforced our rules where it was appropriate," Austin told CBC News after the committee hearing.

Austin said Twitter received reports from users, and convoy organizers were also discussing their plans on Twitter Spaces.

Tuesday's hearing came as speculation swirled over how billionaire Elon Musk's decision to buy Twitter and his pledge to promote free speech could change the social media platform.

Austin told MPs it is too early to know what might change, and that it could take months for Musk's purchase of Twitter to go through.

Both companies defended their handling of extremism, saying they have invested money and hired staff to watch for it on their platforms. Curran said, for example, that 250 white supremacist groups have been banned from Facebook and that the company works with law enforcement and intelligence agencies.

Curran said less than $10,000 was raised for the convoy protest on Facebook.

ABOUT THE AUTHOR

Elizabeth Thompson

Senior reporter

Award-winning reporter Elizabeth Thompson covers Parliament Hill. A veteran of the Montreal Gazette, Sun Media and iPolitics, she currently works with the CBC's Ottawa bureau, specializing in investigative reporting and data journalism. In October 2024 she was named a member of the International Consortium of Investigative Journalists. She can be reached at: elizabeth.thompson@cbc.ca.

With files from Chris Rands
