AI makes deepfake pornography more accessible, as Canadian laws play catch-up
Brieanna Charlebois | The Canadian Press | Posted: February 3, 2024 10:17 PM | Last Updated: February 3
B.C. recently became the latest province to pass laws allowing people to take down explicit content of themselves online
Underage Canadian high school girls are being targeted with AI tools that create fake explicit photos, which then spread online. Google searches bring up multiple free websites capable of "undressing" women in a matter of minutes.
The technology required to create convincing fake pornography has existed for years, but experts warn it's faster and more accessible than ever, creating an urgent challenge for Canadian policymakers.
Advances in artificial intelligence have made it possible to do with a cellphone what once would have required a supercomputer, said Philippe Pasquier, a professor of creative AI at Simon Fraser University in B.C.
Pasquier said society has "lost the certainty" of what is real and what is altered.
"The technology got a little better in the lab, but mostly the quality of the technology that anyone and everyone has access to has got better," he said.
"If you increase the accessibility of the technology, that means good and bad actors are going to be much more numerous."
Across Canada, legislators have been trying to keep up. Eight provinces have enacted intimate image laws, but only half of them refer to altered images.
B.C. recently became the latest, joining Prince Edward Island, Saskatchewan and New Brunswick.
The B.C. law, which came into effect on Jan. 29, allows people to go to a civil resolution tribunal to get intimate images taken down, regardless of whether they are real or fake, and go after perpetrators and internet companies for damages.
Individuals can be fined up to $500 per day, and websites up to $5,000 per day, if they don't comply with orders to stop distributing images that are posted without consent.
Premier David Eby said the recent sharing of fake images of pop star Taylor Swift proved no one was immune to such "attacks."
Attorney General Niki Sharma said in an interview that she is concerned people don't come forward when they are the victim of non-consensual sharing of intimate images, real or not.
"Our legal systems need to step up when it comes to the impacts of technology on society and individuals, and this is one part of that," she said of the new legislation.
The province said it couldn't provide specific data about the extent of AI-altered images and deepfakes.
But cases have occasionally been made public elsewhere.
In December, a Winnipeg school notified parents that AI-generated photos of underage female students were circulating online.
At least 17 photos taken from students' social media were explicitly altered using artificial intelligence. School officials said they had contacted police and made supports available for students directly or indirectly affected.
Manitoba has intimate image laws, but they don't refer to altered images.
Victoria-based internet safety company White Hatter recently conducted an experiment and found it took only minutes using free websites to virtually undress an image of a fully clothed woman, something CEO Brandon Laur called "shocking."
The woman used in the experiment wasn't real — she was also created with AI.
"It's pretty surprising," Laur said in an interview. "We've been dealing with cases [of fake sexual images] since the early 2010s, but back then it was all Photoshop.
"Today, it's much simpler to do that without any skills."
Legal avenues, new and old
Angela Marie MacDougall, executive director of Battered Women's Support Services, said her organization was consulted about the B.C. legislation.
She said Swift's case underscored the urgent need for comprehensive legislation to combat deepfakes on social media, and applauded the province for making it a priority.
But the legislation targets non-consensual distribution of explicit images, and the next "crucial step" is to create legislation targeting creators of non-consensual images, she said.
"It's very necessary," she said. "There's a gap there. There's other possibilities that would require having access to resources, and the women that we work with wouldn't be able to hire a lawyer and pursue a legal civil process around the creation of images … because, of course, it costs money to do that."
But other legal avenues may exist for victims.
Suzie Dunn, an assistant law professor at Dalhousie University in Halifax, said there were several laws that could apply to deepfakes and altered images, including those related to defamation and privacy.
"There's this new social issue that's coming up with AI-generated content and image generators and deepfakes, where there's this kind of new social harm that doesn't fit perfectly in any of these existing legal categories that we have," she said.
She said some forms of fakery could deserve exceptions, such as satire.
"As technology evolves, the law is constantly having to play catch-up and I worry a bit with this, that there might be some catch-up with this generative AI."
Deepfakes 'accelerating' misrepresentation
Pablo Tseng, an intellectual property lawyer in Vancouver, said deepfakes are "accelerating" an issue that has been around for decades: misrepresentation.
"There's always been a body of law that has been targeted towards misrepresentation that's been in existence for a long time, and that is still very much applicable today to deepfakes, (including) the torts of defamation, misrepresentation or false light, and the tort of misappropriation of personality."
But he said specific laws, like the B.C. legislation, are steps in the right direction.
Tseng said he knew of one Quebec case that showcased how the misuse of deepfake technology could fall under child pornography laws. That case led to a prison sentence of more than three years for a 61-year-old man who used AI to produce deepfake child pornography videos.
But Tseng said he wasn't aware of any judgment in which the technology was referenced in the context of misrepresentation.
"It's clear that just because no judgment has been rendered doesn't mean that it isn't happening all around us. Taylor Swift is but the latest example of a string of other examples where celebrities' faces and personalities and portraits have simply been misused," he said.
Dunn said she believed content moderation by websites was likely the best way forward.
She called on search engines like Google to de-index websites primarily focused on creating sexual deepfakes.
"At a certain point, I think some people just give up, even people like Scarlett Johansson or Taylor Swift, because there's so much content being produced and so few opportunities for legal recourse because you would have to sue every individual person who reshares it," Dunn said.
She said while most video deepfakes involve celebrities, there are cases of "everyday women" being targeted.
"All you need to have is one still image of a person, and you can feed it into these nude image generators and it just creates a still image that looks like they're naked, and most of that technology only works on women."