
First orders issued under B.C.'s new intimate images act

A month after the introduction of legislation aimed at combating online harm, Niki Sharma says three orders have been issued in relation to the non-consensual dissemination of intimate images.

Attorney General says province considering further action against platforms serving as 'bad actors'

B.C. Attorney General Niki Sharma said the pain involved with the spread of unwanted sexual images limits the release of details, but she's hoping word of the program's success will encourage other victims to step forward. (iHaMoo/Shutterstock)

A month after introducing legislation aimed at combating online harm, B.C.'s Attorney General says three orders have been issued for the non-consensual dissemination of intimate images.

In a wide-ranging interview about the province's new Intimate Images Protection Act, Niki Sharma said the mechanism set up to deal with complaints under the new law has already seen more than 20 applications — and is providing counselling to 17 people.

Sharma said the pain involved with the spread of unwanted sexual images limits the release of details of the outcomes, but she's hoping word of the program's success will encourage other victims to step forward.

"It's a very victim-driven process, so of course we respect the confidentiality of all the people involved in the process. Especially since it's related to intimate images," she told CBC.

"Just so people know that there are results coming through this process, and that people should know that it's available to them."

Sharma's office contacted CBC about a story highlighting the global trend behind a sextortion case that resulted in charges against a Nigerian suspect in connection with the suicide of a Surrey, B.C., teen in 2023.

Adedayo Olukeye, now awaiting trial in Lagos, is one of several suspects connected with a loosely affiliated group of West African scam artists known as the Yahoo Boys, who terrorize teens by tricking them into providing sexual images and then threatening to release them online.

B.C. Attorney General Niki Sharma says three orders have been issued against individuals through the Intimate Images Protection Act in the past month. (Ben Nelms/CBC)

The Intimate Images Protection Act creates a civil process handled through the Civil Resolution Tribunal, the body that resolves small claims in B.C. It is intended to give victims a quick way to have images, videos and deepfakes posted online without their permission taken down.

Under the terms of the legislation, the tribunal can issue protection orders against individuals or companies requiring an intimate image to be deleted, de-indexed and removed from a website or social media platform.

The tribunal can also award damages of up to $5,000 and order penalties for non-compliance.

3 orders issued to date

According to the tribunal, the first order was issued on Feb. 15. Twenty-six claims for orders had been made as of Friday.

Sharma said all three orders issued to date came about through complaints brought by adult victims against individuals, as opposed to companies.

The process is accompanied by new victim services resources specifically dedicated to handling what's expected to be a huge number of complaints.

"There's a team of five that are directed just towards this because we wanted to be ready for the numbers that we know are out there of people that are experiencing this," Sharma said.

"So you can go through the legal process or the criminal process, but also know you have a team to support you because we've seen really tragic outcomes otherwise."

'There should be some accountability'

Sharma's office is currently working on legislation, to be introduced this spring, that would enable B.C. to pursue the kind of lawsuit American states brought last year against the company that owns Facebook and Instagram.

Thirty-three states, including California and New York, are suing Meta Platforms Inc., alleging the company knowingly designed features that addict children to its platforms and have contributed to a youth mental health crisis.

Meta, the company which owns Facebook and Instagram, is facing a lawsuit from 33 U.S. states in relation to a youth mental health crisis. (Thibault Camus/The Associated Press)

Meta and other platforms, including TikTok, Snapchat and YouTube, are also facing dozens of lawsuits from school boards across the United States that cite the role social media has played in enabling sextortion and other problems.

Sharma said she continues to meet with the companies responsible for the online platforms through which intimate images are disseminated, including Google, Meta, Pornhub and OnlyFans.

She said it's her duty to consider the role of corporate "bad actors" in causing "public harm."

"We did that with the opioid crisis and the opioid companies when they knowingly distributed a product that was harmful to society and we as a society paid for it through tax money and health-care services," she said.

"With these algorithms or these bad actors that are causing harm to society, I think it is the role of a person in my position to say there should be some accountability for those harms through recovery of costs for the damage that you're doing. So we're also looking into that."

'A huge traumatic stress'

The 2012 death by suicide of Coquitlam, B.C., teenager Amanda Todd drew worldwide attention to the issue of online sexual harassment and predators who use the internet to target vulnerable youth.

A Dutch man was extradited to B.C. and convicted in 2022 of extortion, two counts of possession of child pornography, child luring and criminal harassment of Amanda Todd. He is currently serving a six-year sentence after being returned to the Netherlands.

Amanda Todd died by suicide on Oct. 10, 2012, after posting a video on YouTube saying she had been blackmailed by an online predator. (Telus Originals)

Amanda Todd's mother, Carol, was present when Sharma introduced the Intimate Images Protection Act in March 2023. The law came into force on Jan. 29. Todd says she fears the process will still be intimidating.

"If you've been a victim, that's a huge traumatic stress," she said.

"The minute that the offender sees the image or has it described, they're going to know who it is, and they're going to assume that it was the person they victimized — and so there's another fear base."

Todd described the process to get images removed as a "living tool" to be adjusted and improved as data becomes available about its use and effectiveness.

Sharma said her office plans to keep a close eye on data from the Intimate Images Protection program.

At this point, Sharma said, she doesn't believe she has seen accurate statistics on the number of sextortion incidents, apart from higher-level data from police incident reports showing cases are on the rise.

"I think hopefully what we'll see with this process and the tracking of it now, we'll actually see the impact on British Columbians and how many people are facing it," she said.

"Right now, what I know is a lot of people aren't reporting it. There's shame and embarrassment ... that keeps them from speaking about it. I've heard of young people that this has happened to for years that aren't saying anything. It's a crime that sometimes is in hiding."


