Facebook to introduce features on Instagram that nudge young people to take breaks

Features will encourage youth to take breaks and to stop repeatedly looking at potentially harmful content

Silhouettes of mobile users are seen next to a screen projection of an Instagram logo in this picture illustration. Facebook, in the aftermath of damning testimony that its platforms harm children, will be introducing several features to promote healthy usage habits in teens. (Ruvic/Illustration/Reuters)

Facebook, in the aftermath of damning testimony that its platforms harm children, will be introducing several features, including prompting teens on its photo-sharing app Instagram to take a break and "nudging" them if they're repeatedly looking at the same content that's not conducive to their well-being.

The Menlo Park, Calif.-based company is also planning to introduce optional new controls so that parents or guardians can supervise what their teens are doing online. These initiatives come after Facebook announced late last month that it was pausing work on its Instagram for Kids project. But critics say the plan lacks detail, and they are skeptical that the new features will be effective.

The new controls were outlined on Sunday by Nick Clegg, Facebook's vice-president for global affairs, who made the rounds on various Sunday news shows, including CNN's State of the Union and ABC's This Week with George Stephanopoulos, where he was grilled about Facebook's use of algorithms, as well as its role in spreading harmful misinformation ahead of the Jan. 6 Capitol riots.

"We are constantly iterating in order to improve our products," Clegg told Dana Bash on State of the Union Sunday. "We cannot, with a wave of the wand, make everyone's life perfect. What we can do is improve our products, so that our products are as safe and as enjoyable to use."

Clegg said that Facebook has invested $13 billion US over the past few years to keep its platforms safe and that the company has 40,000 people working on these issues.

The flurry of interviews came after whistleblower Frances Haugen, a former data scientist with Facebook, went before Congress last week to accuse the social media platform of failing to make changes to Instagram after internal research showed apparent harm to some teens and of being dishonest in its public fight against hate and misinformation.

Haugen's accusations were supported by tens of thousands of pages of internal research documents she secretly copied before leaving her job in the company's civic integrity unit.

Former Facebook employee Frances Haugen testifies before a U.S. Senate committee in Washington on Oct. 5. Haugen, who left Facebook in May, provided internal company documents to journalists and others, alleging that it consistently chooses profit over safety. (Drew Angerer/Getty Images)

Josh Golin, executive director of Fairplay, a watchdog group focused on children's media and marketing, said he doesn't think introducing controls to help parents supervise teens would be effective, since many teens set up secret accounts anyway.

He was also dubious about how effective it would be to nudge teens to take a break or steer them away from harmful content. He said Facebook would need to show exactly how those tools would work and offer research demonstrating that they are effective.

"There is tremendous reason to be skeptical," he said. He added that regulators need to restrict what Facebook does with its algorithms.

Golin said he also believes that Facebook should cancel its Instagram for Kids project.

When Clegg was grilled by both Bash and Stephanopoulos in separate interviews about the use of algorithms in amplifying misinformation ahead of the Jan. 6 riots, he responded that if Facebook removed the algorithms, people would see more, not less, hate speech, and more, not less, misinformation.

Clegg told both hosts that the algorithms serve as "giant spam filters."

Democratic Sen. Amy Klobuchar of Minnesota, who chairs the Senate commerce subcommittee on competition policy, antitrust, and consumer rights, told Bash in a separate interview on Sunday that it's time to update children's privacy laws and offer more transparency in the use of algorithms.

"I appreciate that he is willing to talk about things, but I believe the time for conversation is done," Klobuchar said, referring to Clegg's plan. "The time for action is now."