The Current

Canada could lead a coalition to force change at Facebook, says whistleblower Frances Haugen

Former Facebook employee turned whistleblower Frances Haugen disclosed thousands of documents about the social media giant last year, and testified before U.S. senators that it chooses profits over the safety and well-being of its users.

Haugen worked at Facebook for 2 years and claims company chooses profits over safety

Facebook whistleblower Frances Haugen appears before U.S. senators on Oct. 5, 2021, in Washington, D.C. (Drew Angerer/Getty Images)


Canada has an "amazing opportunity" to lead a coalition of smaller countries in demanding accountability from Facebook, says former employee-turned-whistleblower Frances Haugen.

"If you can get together 100 million, 200 million people — you know, countries — you will be able to force change from Facebook," said Haugen, who left her role as a product manager at the social media giant last May and disclosed thousands of internal company documents to the media.

Haugen has accused Facebook of putting profits before the well-being of its users — from failing to protect children and their mental health, to fuelling misinformation and inciting political violence. She's also called for stricter government oversight to address these problems.

"I have faith that Canada could be a leader in driving that change," she told The Current's Matt Galloway.

Haugen said she disagrees with the idea that larger powers, like the U.S. and European Union, should lead the charge.

She pointed to the U.K., which last year brought in sweeping regulations (and potentially heavy fines) around how websites and apps interact with children online, and how their data can be used.

WATCH | Canada could lead 'coalition for change,' says Facebook whistleblower:

Smaller countries could band together to demand change from the tech giant, says former Facebook employee Frances Haugen. (Video, 2:43)

As a result of the new rules, TikTok stopped sending younger users notifications later in the evening and YouTube removed the autoplay function on videos for users aged 13-17. Facebook exempted users under 18 from some forms of targeted advertising, while the company's Instagram platform made accounts for teen users private by default.

Haugen said Facebook made the changes globally because it's difficult to customize services across multiple countries.

A coalition of countries led by Canada could call for similar changes, she said, through co-ordinated legislation that demands transparency and accountability from the company.

"The reality is, Facebook has to live in democracy's house, right?" she said.

After Haugen went public in October, social media experts told CBC News that social platforms were unlikely to fix the problems on their own and needed government involvement.

The federal Liberal government tabled three bills around online protections during the last session of Parliament.

The federal government tabled three bills around online protections for Canadians in the last Parliament, but all died when the election was called in August. (Jenny Kane/Associated Press)

All three bills died when Parliament was dissolved in August ahead of the federal election, though the Liberals have pledged to resurrect them. 

In a statement, a spokesperson for Canadian Heritage Minister Pablo Rodriguez said the federal government is committed to protecting Canadians from harmful content online and to holding "social media platforms and other online services accountable for the content they host."

Canadian Heritage is co-leading federal efforts to regulate Internet giants.

The statement also pointed to a consultation that ran from July to September last year, which asked the public and interested parties for feedback on how to tackle the issue. It said legislation is now being developed based on that consultation.

'Impasse' over fixing problems

Haugen started working at Facebook in 2019, hoping to help solve problems around misinformation on the platform, but said she quickly noticed an "impasse."

"The problem is the people whose job is to find these problems, and the people whose job is to authorize fixing these problems, are different people," she said.

If fixing a problem didn't align with the incentives of those authorized to fix it, such as company growth, it didn't get fixed, she said.

Haugen said she believes that Meta CEO Mark Zuckerberg 'has to learn to be a slightly different leader now.' (Marcio Jose Sanchez/The Associated Press)

Part of Haugen's work was on the civic integrity team, which she described as being tasked with making Facebook a "positive force in politics." But that team was dissolved a month after the 2020 U.S. presidential election, and its staff were moved to a broader safeguarding team that had no specific mandate on politics.

That's when Haugen decided to go public with her concerns.

"It showed a level of lack of commitment and, like, a blindness, that I was like, 'This is just not acceptable.' You can't have a force that's this dangerous that thinks itself as safe," she said.

Haugen took pictures of internal documents before she left, which became the basis of a series of Wall Street Journal exposés.

Facebook didn't set out to 'incentivize rage'

Among her allegations was that the company was aware its Instagram platform could have a negative impact on the body image and mental health of its users, but failed to take action, a point of particular concern to U.S. lawmakers when Haugen testified before a Senate committee last October.

At the time, Facebook responded that "the story focuses on a limited set of findings and casts them in a negative light," but said it stood by the research.

Haugen also alleged that an algorithm change in 2018 prioritized showing users content with more comments or shares, but that much of that engagement was negative, such as people arguing within comment threads. 

Though the new algorithm brought more eyes to divisive content, she said, it also increased the amount of time users spent on the platform, which in turn increased revenue from digital ad sales.
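
Her description amounts to an engagement-weighted feed: posts that draw more comments and shares are ranked higher, regardless of whether that engagement is friendly or hostile. The toy sketch below illustrates the general idea only; the post fields, weights and scoring function are invented for illustration and are not Facebook's actual ranking code.

```python
# Illustrative sketch only: a toy engagement-weighted ranking, not Facebook's
# actual algorithm. The Post fields and the weights are invented for this example.
from dataclasses import dataclass


@dataclass
class Post:
    text: str
    comments: int
    shares: int
    likes: int


def engagement_score(post: Post) -> float:
    """Score a post by weighting the interactions it triggers.

    Weighting comments and shares more heavily than likes means posts that
    provoke long argument threads can outrank calmer content, whatever the
    tone of the engagement.
    """
    return 3.0 * post.comments + 2.0 * post.shares + 1.0 * post.likes


def rank_feed(posts: list[Post]) -> list[Post]:
    """Order a feed so the most-interacted-with posts appear first."""
    return sorted(posts, key=engagement_score, reverse=True)


if __name__ == "__main__":
    feed = [
        Post("Cute dog photo", comments=4, shares=2, likes=120),
        Post("Divisive political take", comments=90, shares=40, likes=30),
    ]
    for post in rank_feed(feed):
        print(f"{engagement_score(post):>6.1f}  {post.text}")
```

In a toy feed like this, the divisive post outscores the benign one on comments and shares alone, which is the dynamic Haugen says kept users on the site longer and drove ad revenue.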

WATCH | Facebook chooses profits over safety, Haugen testifies:

Former Facebook data scientist-turned-whistleblower Frances Haugen urged U.S. lawmakers to intervene in the social media giant's operations. Speaking before a Senate panel, Haugen outlined how Facebook knew its products and algorithms were steering users toward dangerous and toxic content, yet did nothing about it. (Video, 2:37)

Haugen told The Current she didn't think the company set out to "incentivize rage," but it happened amid an overall drive to increase interaction on the site. "They didn't spend enough on safety systems or on people watching for these problems. That was the real issue," she said.

In an email statement to The Current, a spokesperson for Meta, the parent company of Facebook, said the premise at the centre of Haugen's claims is "false."

"Yes, we're a business and we make profit, but the idea that we do so at the expense of people's safety or well-being misunderstands where our own commercial interests lie."

The statement further said Facebook has "over 40,000 people to do one job: keep people safe on our services."

"I think we probably need a different leader to come in because he hasn't demonstrated a willingness to change."
- Frances Haugen on Meta CEO Mark Zuckerberg

Haugen said advocates have long raised concerns about Facebook's operations and impact, but transparency has been a key problem. 

When concerns are raised, she said, the company will often downplay external evidence as "anecdotal," without revealing its own investigations into the problems or outlining what corrective action it has taken.

She described a hypothetical scenario in which there are concerns about children being exposed to posts about self-harm. Through legislation, she said, Facebook could be compelled to track and report how many children are seeing that content and how often. 

"Imagine a world where that number is reported. Would Facebook get better about self-harm content? Almost certainly. So we have to change that dynamic," she said.

Zuckerberg hasn't shown 'willingness to change'

After Haugen's testimony last October, CEO Mark Zuckerberg said the allegations mischaracterized Facebook's work and priorities.

Later that month, Facebook Inc. rebranded to Meta, with Zuckerberg laying out a vision of a digital world where people can use avatars to play games together or attend virtual concerts.

To Haugen, this pivot to "video games and the metaverse" shows Zuckerberg is still primarily interested in growing the company — something he has been richly rewarded for over the years — rather than addressing its problems. 

WATCH | Mark Zuckerberg has been richly rewarded for site's growth, says whistleblower:

For years, Zuckerberg's focus has been on growth and the rewards it brought, says whistleblower Frances Haugen. (Video, 1:59)

She expressed empathy for Zuckerberg, saying "Mark is in a really hard place because he has to learn to be a slightly different leader now," but said she's not sure that's something he's willing to do, and the company may require a new leader.

"Though I do have faith that if he wanted to take that path, he could do it," she said.

Haugen also said that she would work for Facebook again, if the company asked her back.

"I believe this is the biggest problem in the world and we have to, have to, have to solve it."


Written by Padraig Moran. Produced by Ben Jamieson.
