When the Twitch streamer known as QTCinderella discovered her face depicted on a nude, AI-generated body — becoming the latest victim of the “deepfake porn” phenomenon — telling her story publicly felt like the only thing she could do.
As she described in an interview for the CBC podcast Deepfake Porn Empire, she needed people to know what this kind of degradation felt like. It’s not that she wanted to talk about what happened; it just felt like the only way to actually do something about it.
Indeed, for anyone who discovers they’ve been violated in this way, there are very few options for meaningful recourse. It’s hard to sue an anonymous user, after all. And even if the perpetrator’s identity is known, a lawsuit is time-consuming and costly, both financially and emotionally.
But here in Ontario, the barriers are especially high. In fact, we’re the only province in Canada without a dedicated pathway for victims of non-consensual intimate images to pursue civil litigation. So Ontarians seeking accountability for sexualized deepfakes are left to piece together other statutes, like defamation law, that were never intended for this purpose in the first place.
And the problem is only growing.
From teens bullying their classmates through AI-powered “nudify” apps, to entire websites dedicated to non-consensual deepfake porn, there’s no question that deepfakes are on the rise. And, according to the International AI Safety Report released in January, the problem is almost exclusively sexual in nature, with these kinds of images accounting for about 96 per cent of deepfakes.
It’s not just that the issue is becoming more prevalent. Deepfakes are also getting harder to distinguish from reality. In a 2023 Court of Quebec criminal case involving the technology, Justice Benoit Gagnon noted it was “impossible” to separate the real from the fake, warning that “the police have clearly entered a new era of cybercrime.”
Although the offender in that case was sentenced to almost six years, Canada’s law on non-consensual intimate images does not currently account for deepfakes. An amendment to close that gap has been proposed, however, as part of the Protecting Victims Act tabled in December.
But according to Susie Lindsay, counsel at the Law Commission of Ontario, most victims don’t want a criminal case. What they really want, she says, are practical remedies: an easy way to have their photos removed and a way to seek damages. Yet Ontario, alone among the provinces, offers no such pathway.
This is why our first task should be creating a simple, fast-tracked strategy for dealing with non-consensual intimate images, including deepfakes. As an example, we could look to B.C.’s online Civil Resolution Tribunal for intimate images, which accounts for both real and altered images, and links victims to a dedicated emotional support service.
A public consultation on the law around intimate images and deepfakes in Ontario is currently in the works, led by Lindsay as part of a new LCO project.
The rest of the country recognized this gap long ago and acted to close it. It’s time for Ontario to get on board.
