Act now in force in B.C. to help remove intimate images posted online without consent
Victims wanting images removed can now apply through B.C. Civil Resolution Tribunal
Starting Monday, residents whose intimate images have been posted online without their consent will be able to apply through the B.C. Civil Resolution Tribunal (CRT) to have the photos, videos or deep fakes expeditiously removed, and even to be compensated for the sexualized violence.
B.C. is the ninth province in Canada to draft and enact an Intimate Images Protection Act (IIPA). It provides a path for victims to regain control of their private images and for perpetrators to be held accountable.
"You can see that the government is really trying to create a deterrent effect for this type of behaviour," said Claire Feltrin, a data privacy and cybersecurity lawyer in Vancouver who helped advise the tribunal over how to adapt to the act and handle cases.
The invasion of privacy and dignity is an ongoing problem across the country. The Canadian Centre for Child Protection says millions of cases of suspected child sexual abuse material on sites such as Facebook, Instagram, TikTok and Pinterest are flagged each year.
In 2022, it said nearly 32 million cases were recorded.
"We have the ability to force that image to come down … there's a process in place to ensure that that happens," said Premier David Eby at a news conference about cybersecurity and youth on Friday.
On Monday, he said fake, AI-generated images of pop icon Taylor Swift that went viral over the weekend underscored the need for these types of protections.
"If Taylor Swift is not immune from this, certainly British Columbians are not," Eby said at an unrelated event in Ottawa.
B.C. Attorney General Niki Sharma announced the act in March 2023. It streamlines the process for images to be taken down and gives victims an avenue to claim compensation from people who shared their photos without permission.
The province says the legislation, civil rather than criminal, will cover intimate images, near-nude images, videos, livestreams and digitally altered images and videos, such as those known as deep fakes.
The act requires perpetrators to destroy the images and remove them from the internet, search engines and all forms of electronic communication. It also covers threats to distribute intimate images.
"If it's not your image, you don't have the right to share it," said Sharma on Friday.
B.C.'s intimate images act complements laws enshrined in Canada's Criminal Code in 2015 in response to public outrage over the suicides of Canadian teenagers Amanda Todd and Rehtaeh Parsons, who were targets of cyberbullying and sextortion.
Under the IIPA, the CRT can order an administrative penalty if individuals or internet intermediaries, such as social media companies, don't comply.
The penalties are $500 per day up to a maximum of $10,000 for individuals and $5,000 per day up to a maximum of $100,000 for internet intermediaries.
The CRT can also award damages to victims of up to $5,000 for harm caused by having the images online without consent.
How to make a claim
Starting Monday, people age 14 and older can bring CRT intimate image protection order claims on their own, or with the help of a trusted adult through a portal on the CRT's website.
People age 12 and 13 must engage the help of a trusted adult to make a claim, while those under 12 can only make a protection order claim through a litigation guardian, which is a special type of representative, usually a parent or legal guardian, the CRT said in an email.
For any other type of CRT claim, such as a claim for damages, people under 19 must have a litigation guardian.
The CRT said the Solution Explorer, the online tool used to start a claim, is free, anonymous and available 24/7.
It will have applicants answer simple questions and then provide customized legal information and options based on those answers. Feltrin said applications made through the tribunal should ultimately expedite the removal of images compared to seeking their removal through the criminal justice system.
"The process should be extremely efficient," she said.
Sharma did not say on Friday how long it would take — days, weeks or months — to remove images, but promised to continue to meet with companies to make sure it's expeditious.
Anonymous internet
While advocates such as Feltrin are applauding the act, they say it does have limitations.
If an internet intermediary can show it's taken reasonable steps to "address" the unlawful distribution of intimate images, then its liability will be reduced.
Lawyers like Feltrin also say the act will be difficult to enforce for cases where the identity of the person posting the images is unknown or anonymous.
"The anonymity of the internet is a pervasive issue when it comes to a lot of cyber crimes and to the extent that an individual is not aware of who distributed an image or is threatening to distribute such an image, those prevention orders will be difficult to have a good effect," she said.
The province announced other measures Friday around privacy and cybersecurity involving youth.
They include a new support system for victims of sexualized violence involving intimate images, working with schools to limit cellphone use in classrooms starting in September, and proposed new legislation to hold companies accountable for the harms their products may have caused the public.
If passed, it would allow the government to recover costs caused by harms from social media platforms and their algorithms.
With files from Yvette Brend and the Canadian Press