She was careful online, but this Toronto teen was still targeted with deepfake porn

'I didn't do anything wrong… I was filled with a lot of anger'

Ruby, far right, seen here with her family, is a 16-year-old who recently learned a nude photo of her had been deepfaked. Police believe the culprit was using it as part of a phishing scheme. (Mia Sheldon/CBC)

One of the worst things that can happen to a person, according to Ruby, a 16-year-old from Toronto, is to find a nude picture of yourself on the internet. 

And that's exactly what happened to her, through no fault of her own. 

"Suddenly I was in that worst-case scenario," she said.

CBC News is not revealing Ruby's last name because she was a victim of a worrying new trend — sexually explicit deepfakes of minors.

A deepfake is any image or video that has been altered or created, often using artificial intelligence, that is difficult to distinguish from the real thing. 

Last year, Ruby got a series of messages from someone saying there were images of her online, asking her to click a link to see them. She asked to see them, and was sent a deepfake of herself, topless. The original photo, in which she was fully clothed, had been taken when she was 13. 

Lindsay Lobb, left, with the Canadian Centre for Child Protection, says deepfakes are often used to extort, harass or bully minors, and are easy to make because of the many sites and apps that will 'nudify' an image. (Mia Sheldon/CBC)

Ruby had been taught how to be safe online, and she doesn't post under her real name. 

"I didn't do anything wrong," she said. "It just happened… I was filled with a lot of anger."

"I have no idea who this person is." 

Ruby's parents called Cybertip.ca, a national hotline for people to report sexually explicit images of minors. It processed 4,000 sexually explicit deepfakes in the past year, the first year it started tracking them. 

The problem "has continued to grow and evolve," said Lindsay Lobb, the operations director of support services at the Canadian Centre for Child Protection, which runs Cybertip.ca.

The deepfakes are often used to extort, harass or bully minors, she says, and are easy to make because of the many sites and apps that will "nudify" an image. 

'Massive generational leaps'

There have been some high-profile cases of sexually explicit deepfakes circulating in Canadian and U.S. high schools.

In Canada, any sexually explicit image of a minor, deepfake or not, is illegal and considered child pornography. Online and social media platforms say they report images found on their sites to police.

And it will only get harder to distinguish deepfakes from the real thing, says Brandon Laur, an online safety educator.

"Every year we are going to see massive generational leaps in the realness of those images," he said. 

Even now, he says, parents don't always believe children when they say the images are not real. 

WATCH | AI can turn any photo into deepfake porn. It's almost impossible to prevent: 

Duration 9:49
It only takes seconds for AI to turn an innocent photo into something pornographic that can be distributed online. CBC's Ellen Mauro breaks down how the images are being used illegally, why it's nearly impossible to stop and sees first-hand how easy these deepfakes are to make.

Laur says it's unrealistic to expect people not to post online, but wants to raise awareness that once an image is up, even with secure settings, it's virtually impossible to control what happens to it. 

The RCMP and other police services have expressed concern over the appearance of these types of images. But legal recourse can be difficult, says Molly Reynolds, a lawyer at Torys LLP in Toronto, who has represented adult victims in civil cases. 

Deepfakes can be made by ex-partners for revenge, by fellow colleagues and students to threaten and bully, or by strangers in other countries.

"If a stranger just takes your image anywhere in the world and turns it into a deepfake, it can be very challenging to find a legal path in Canada to stop that," Reynolds said.

Reynolds says victims can submit a takedown request to the site hosting the image, such as Google or Meta. 

After that, "there may be civil law or criminal law routes to make a claim for harassment," she said. 

In Ruby's case, police did not find evidence the image was distributed online. They believe the person who contacted her was trying to hack into her iCloud in an elaborate phishing scheme. 

She remains shaken and wants people to know this can happen to them. 

"What's still being taught around cybersecurity is that nothing ever leaves the internet — and to be safe, don't take nude photos," she said. "And that's true. But now it's a whole other ballgame."

WATCH | Real or deepfake? 

Can you spot the deepfake? How AI is threatening elections

Duration 7:08
AI-generated fake videos are being used for scams and internet gags, but what happens when they're created to interfere in elections? CBC's Catharine Tunney breaks down how the technology can be weaponized and looks at whether Canada is ready for a deepfake election.

ABOUT THE AUTHOR

Ellen Mauro is a senior reporter based in Toronto, covering stories in Canada and beyond, including recent deployments to Haiti and Afghanistan. She was formerly posted in Washington, D.C. where she covered the Trump White House for CBC News. Previously, she worked at CBC's London, U.K. bureau where she covered major international news stories across Europe and Africa.