Zuckerberg apologizes to families at heated U.S. Senate hearing on social media child safety

On Wednesday, the CEOs of Meta, TikTok, X and other social media companies went before the U.S. Senate judiciary committee to testify as lawmakers and parents grow increasingly concerned about the effects of social media on young people's lives.

Lawmakers, families of victims say platforms not doing enough to protect minors

Mark Zuckerberg apologizes to victims of online harm after heated U.S. Senate exchange

The head of Meta was challenged to apologize — on the spot — to families of children exposed to sexual content on his social media platforms as he answered questions about the actions taken to protect victims from online harm.

Sexual predators. Addictive features. Suicide and eating disorders. Unrealistic beauty standards. Bullying. These are just some of the issues young people are dealing with on social media — and children's advocates and lawmakers say companies are not doing enough to protect them.

The hearing began with recorded testimony from children and parents who said they or their kids were exploited on social media. Throughout the hours-long event, parents who lost children to suicide silently held up pictures of their dead loved ones.

"They're responsible for many of the dangers our children face online," U.S. Senate Majority Whip Dick Durbin, a Democrat who chairs the committee, said in opening remarks. "Their design choices, their failures to adequately invest in trust and safety, their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk."

In a heated question-and-answer session with Mark Zuckerberg, Republican Sen. Josh Hawley of Missouri asked the Meta CEO if he has personally compensated any of the victims and their families for what they have been through.

At the Senate hearing on Wednesday, Republican Sen. Josh Hawley asked Meta CEO Mark Zuckerberg if he has personally compensated any of the victims and their families for what they have been through. (Jose Luis Magana/The Associated Press)

"I don't think so," Zuckerberg replied.

"There's families of victims here," Hawley said. "Would you like to apologize to them?"

Parents attending the hearing rose and held up pictures of their children. Zuckerberg stood as well, turning away from his microphone and the senators to address them directly.

"I'm sorry for everything you have all been through. No one should go through the things that your families have suffered," he said, adding that Meta continues to invest and work on "industry-wide efforts" to protect children.

'Dangerous products'

But time and time again, children's advocates and parents have stressed that none of the companies are doing enough.

"Meta's general approach is 'trust us, we'll do the right thing,' but how can we trust Meta? The way they talk about these issues feels like they are trying to gaslight the world," said Arturo Bejar, a former engineering director at the social media giant known for his expertise in curbing online harassment, who recently testified before Congress about child safety on Meta's platforms.

WATCH | What social media scrolling is doing to kids' brains:

With most children and teenagers spending hours a day on a smartphone, CBC’s Christine Birak breaks down what research shows about how using social media is changing kids’ behaviour, if it's rewiring their brains and what can be done about it.

"Every parent I've met with a kid under 13 is afraid of when their kid is old enough to be in social media."

Hawley continued to press Zuckerberg, asking if he'd take personal responsibility for the harms his company has caused. Zuckerberg stayed on message and repeated that Meta's job is to "build industry-leading tools" and empower parents.

"To make money," Hawley cut in.

South Carolina Sen. Lindsey Graham, the top Republican on the judiciary panel, echoed Durbin's sentiments and said he's prepared to work with Democrats to solve the issue.

"After years of working on this issue with you and others, I've come to conclude the following: Social media companies as they're currently designed and operate are dangerous products," Graham said.

He told the executives that their platforms have enriched lives but that it is time to deal with "the dark side."

Federal bill in the works

Beginning with Discord's Jason Citron, the executives touted existing safety tools on their platforms and the work they've done with non-profits and law enforcement to protect minors.

Snapchat had broken ranks ahead of the hearing and began backing a federal bill that would create a legal liability for apps and social platforms that recommend harmful content to minors. Snap Inc. CEO Evan Spiegel reiterated the company's support on Wednesday and asked the industry to back the bill.

TikTok CEO Shou Zi Chew said TikTok is vigilant about enforcing its policy barring children under 13 from using the app. X CEO Linda Yaccarino said the platform, formerly known as Twitter, doesn't cater to children.

From left: Discord CEO Jason Citron, Snapchat CEO Evan Spiegel, TikTok CEO Shou Zi Chew, X CEO Linda Yaccarino and Zuckerberg watch a video of victims shown during the U.S. Senate hearing on Wednesday. (Andrew Caballero-Reynolds/AFP/Getty Images)

"We do not have a line of business dedicated to children," Yaccarino said, adding that the company will also support the Stop CSAM Act, a federal bill that would make it easier for victims of child exploitation to sue tech companies.

Yet child health advocates say social media companies have failed repeatedly to protect minors.

"When you're faced with really important safety and privacy decisions, the revenue in the bottom line should not be the first factor that these companies are considering," said Zamaan Qureshi, co-chair of Design It For Us, a youth-led coalition advocating for safer social media. "These companies have had opportunities to do this before they failed to do that. So independent regulation needs to step in."

Republican and Democratic senators came together in a rare show of agreement throughout the hearing, though it's not yet clear if this will be enough to pass legislation such as the Kids Online Safety Act, proposed in 2022 by Sen. Richard Blumenthal of Connecticut and Sen. Marsha Blackburn of Tennessee.

Meta emails under scrutiny

Meta is being sued by dozens of states that say it deliberately designs features on Instagram and Facebook that addict children to its platforms and has failed to protect them from online predators.

New internal emails between Meta executives released by Blumenthal's office show Nick Clegg, president of global affairs, and others asking Zuckerberg to hire more people to strengthen "well-being across the company" as concerns grew about effects on youth mental health.

"From a policy perspective, this work has become increasingly urgent over recent months. Politicians in the U.S., U.K., E.U. and Australia are publicly and privately expressing concerns about the impact of our products on young people's mental health," Clegg wrote in an August 2021 email.

An installation against social media companies, featuring Zuckerberg of Meta and Chew of TikTok, is displayed outside the U.S. Capitol building on Wednesday. (Julia Nikhinson/AFP/Getty Images)

The emails released by Blumenthal's office don't appear to include a response, if there was any, from Zuckerberg. In September 2021, the Wall Street Journal released the Facebook Files, a report based on internal documents from whistleblower Frances Haugen, who later testified before the Senate.

Meta has beefed up its child safety features in recent weeks, announcing earlier this month that it will start hiding inappropriate content from teenagers' accounts on Instagram and Facebook, including posts about suicide, self-harm and eating disorders.

It also restricted minors' ability to receive messages from anyone they don't follow or aren't connected to on Instagram and on Messenger, and it added new "nudges" to try to discourage teens from browsing Instagram videos or messages late at night. The nudges encourage kids to close the app, though they do not force them to do so.

Google's YouTube was notably missing from the list of companies called to the Senate on Wednesday, even though more kids use YouTube than any other platform, according to the Pew Research Center. Pew found that 93 per cent of U.S. teens use YouTube, with TikTok a distant second at 63 per cent.