Victims, most of whom are women, need better judicial recourse to hold perpetrators accountable.
Gendered violence: another year, another way to perpetuate the cycle. What is the latest trend? The creation of non-consensual deepfakes.
Deepfakes are pictures or videos that have been created or altered with the use of artificial intelligence. While such technology is not inherently evil, it sometimes finds itself in the hands of users who cannot, or will not, see the moral issue at stake. Unsurprisingly, the majority of deepfake content found online is non-consensual pornography, and the majority of it targets women.
Canada needs better legal recourse for victims of non-consensual deepfakes to hold perpetrators accountable.
It’s becoming increasingly clear that deepfakes are having impacts similar to those of what’s called “non-consensual distribution of intimate images” (NCDII). Non-consensual deepfakes are being used as a tool by abusers to control their victims. One study found that, much like revenge porn, “deepfakes are used to control, intimidate, isolate, shame and micromanage victims,” mostly women.
Victims of deepfakes also experience constant anxiety about who has viewed this content or when they might see it next. Even if the content is taken down, it may already have been shared or saved to personal devices.
The new Online Harms Act sheds light on the issue of deepfakes. This Act holds promise with the creation of the Digital Safety Commission, which will work in tandem with social media platforms to restrict the proliferation of deepfake content. Platforms will need to implement tools to flag harmful content. More importantly, content that is deemed to be harmful, such as NCDII, will be taken down within 24 hours.
While this is a step in the right direction, we are still not holding perpetrators accountable.
The Online Harms Act did not introduce any changes to the Criminal Code. Yet the quality of deepfakes has improved to the point that they are difficult to distinguish from unaltered pictures, and victims face the same consequences as with NCDII. Non-consensual deepfakes of a sexual nature should therefore carry the same judicial consequences as NCDII. We could amend our Criminal Code to criminalize non-consensual deepfakes of a sexual nature.
It would also be ideal to implement a new tort that recognizes deepfakes as a social and ethical wrong. This tort could apply when a defendant distributes non-consensual deepfakes of the plaintiff. A perpetrator should not be able to rely on the defence that they used media the plaintiff voluntarily uploaded online; the issue at stake is how those once-consensual images are being used.
At present, the only way to avoid becoming a victim of deepfakes is to limit your online presence. But not having an online presence can be disadvantageous in today’s reality, and this approach places the responsibility on victims, mainly women, instead of condemning perpetrators.
Because the content that is used to create deepfakes has likely been uploaded online consensually, creators of deepfakes might believe that they’re not doing anything wrong, or they simply don’t care. In reality, this type of content is harming women and is just a new way of perpetuating gendered violence.
While it would be ideal to stop the creation and sharing of deepfakes before it happens, doing so can be very difficult, if not impossible. Victims need access to better judicial recourse to hold perpetrators accountable.
Katheryne Soucy is a J.D. candidate (2025) at the University of Ottawa.