Abstract
This article explores the evolution of Canadian criminal and civil responses to non-consensual synthetic intimate image creation and distribution. In recent years, the increasing accessibility of this type of technology, sometimes called deepfakes, has led to the proliferation of non-consensually created and distributed synthetic sexual images of both adults and minors. This is a form of image-based sexual abuse that lawmakers have sought to address through criminal child pornography laws and non-consensual distribution of intimate image provisions, as well as provincial civil intimate image legislation. Depending on the province a person resides in and the age of the person in the image, they may or may not have protection under existing laws. This article reviews the varied language used to describe what is considered an intimate image, ranging from definitions seemingly limited to authentic intimate images to those covering altered images and images that falsely present the person in a reasonably convincing manner.
Résumé
Cet article explore l’évolution des réponses pénales et civiles canadiennes à la création et à la distribution d’images intimes synthétiques non consensuelles. Ces dernières années, l’accessibilité croissante de ce type de technologie, parfois appelée « deepfakes », a conduit à la prolifération d’images sexuelles synthétiques d’adultes et de mineurs créées et distribuées sans consentement. Il s’agit d’une forme d’abus sexuel par l’image que les législateurs ont cherché à combattre en adoptant des lois pénales sur la pornographie juvénile et des dispositions sur la distribution non consensuelle d’images intimes, ainsi que des lois civiles provinciales sur les images intimes. Selon la province dans laquelle une personne réside et l’âge de la personne figurant sur l’image, elle peut ou non bénéficier d’une protection en vertu des lois existantes. Cet article passe en revue les différents termes utilisés pour décrire ce qui est considéré comme une image intime, allant de définitions apparemment limitées à des images intimes authentiques à des images modifiées et à des images qui présentent faussement la personne d’une manière raisonnablement convaincante.
Introduction
In January 2024, X (formerly Twitter) was flooded with sexual images of Taylor Swift.[1] These images were created using artificial intelligence (AI) image generators without Swift’s consent.[2] Some of the most well-known AI image generators, including the one used to create the images of Swift, have attempted to curb or prevent the generation of celebrity and sexual images through their products altogether.[3] However, a group of 4chan users challenged each other to find a way to bypass these prompt guardrails and were eventually able to generate the intimate images of Swift that were later widely shared on X.[4] The images were viewed millions of times, and the non-consensual creation and distribution of these images was widely condemned in the media and by some governments.[5] Although this story brought newfound energy to the debate about regulating non-consensual synthetic intimate images (NSII; i.e., sexual images of a person that were created using technology such as AI or Photoshop without their consent), celebrities like Swift have had sexual deepfakes made of them for years without their consent with little legal recourse.[6] As the technology became more accessible and diverse, the breadth of who was targeted also expanded. In today’s world, we see a wide swath of people targeted by NSII, ranging from celebrities to schoolgirls.[7]
Nationally and internationally, there have been various calls to regulate synthetic media.[8] This article does not address the full scope of potential harms that synthetic media in general may cause; its analysis will be limited to various forms of NSII.[9] While it is important that lawmakers address the broader violations associated with synthetic media,[10] NSII requires a different legal analysis compared to other forms of synthetic imagery, because these images contain sexual content that involves the uniquely sensitive issues of sexual integrity and privacy.[11] This article examines some of the existing and proposed legal solutions related to NSII in Canada and discusses the evolving definition of intimate images.
Part I provides an overview of NSII. It describes the meaning of this term, how it fits under the larger umbrella of image-based sexual abuse (IBSA), and the harms caused by NSII that call for a legal remedy. Part II provides a brief overview of existing and proposed laws in Canada that address NSII. It then examines how the definition of “intimate images” in Canada’s criminal and civil intimate image laws could be inclusive of NSII and discusses some challenges that may arise with various existing and proposed definitions.
I. Overview of Non-Consensual Synthetic Intimate Images and their Harms
A. What Are Non-Consensual Synthetic Intimate Images?
Non-consensual synthetic intimate images (NSII) are intimate images of a person that are created using technology such as AI or Photoshop without the consent of the person featured in them.[12] Synthetic media refers to any form of media that has been digitally manipulated or created to represent something that does not exist in reality, often with the use of AI.[13] This type of media is created using face-swapping technology, such as replacing a person’s face in an existing pornographic video or superimposing their face on a live sex video; image manipulation, which adds or changes the information in an image, such as by making a clothed person appear nude; or generative AI, which is used to create entirely new images where a person appears nude or engaged in sexual activity.[14] This technology can be used for creative and positive sexual purposes. However, when used without the consent of the person represented in the image, it can cause harm worthy of legal intervention. NSII include images in which a person is depicted as nude or semi-nude, exposing their genitals, anal region, or breasts, or engaged in sexually explicit activity. Crucially, NSII are made or distributed without that person’s consent. They can be a form of image-based sexual abuse.[15]
Deepfake videos are one of the most well-known forms of synthetic media.[16] Deepfakes are videos or images in which AI has been used to alter the content of the image.[17] Deepfakes originated as a form of AI that could swap faces in videos, but the term “deepfake” has since been used to describe many forms of synthetic media that misrepresent something or someone.[18] Typically, a non-consensual sexual deepfake is a video where a person’s face is superimposed on a previously existing pornographic video, resulting in fairly realistic footage that appears as though a person is engaging in sex acts that they did not actually perform.[19] As discovered by journalist Samantha Cole, deepfake technology was popularized in 2017 after a Reddit user posted sexual deepfakes he had made of several famous female celebrities on a Reddit board.[20] By 2023, hundreds of thousands of deepfake videos existed online.[21] Of the publicly available sexual deepfakes, the vast majority are made without the consent of the person in the image, and they almost exclusively feature women.[22] Recent studies by Umbach et al. and Flynn et al. also show noteworthy self-reported rates of victimization among men,[23] although images featuring men seem less likely to appear online publicly. These studies also show that men are more likely to create and consume sexual deepfakes.
Other forms of AI and digital technologies have been used to create NSII, such as “nudifying” apps that transform still images of fully clothed women or girls into photos where they appear to be fully nude,[24] generative AI used to create entirely new sexual images,[25] or older technology, like photo editing software, that can merge or edit photos of a person to make it falsely appear that they are naked or engaging in sex acts.[26]
Despite many social media and pornography websites banning NSII,[27] the popularity of these images has not waned.[28] Wired reported that an independent researcher found that nearly 250,000 sexual deepfakes were uploaded onto thirty-five of the most popular deepfake pornography websites over the last seven years, with more than 100,000 being uploaded in 2023 alone.[29] In 2024, McGlynn reported that one of the most popular deepfake websites was accessed around 17 million times per month.[30] In 2020, Sensity reported that over 100,000 images of women were artificially stripped nude and shared publicly on Telegram; 70% of these images were taken from social media profiles or were private images of private individuals.[31] These images are sometimes called “deepnudes.” In 2023, Graphika identified over 24 million unique views on thirty-four NSII providers’ websites, showing the growing interest in the technology.[32] As such, the use of this type of technology is becoming more widespread. Further, between 2023 and 2024, news reports emerged of male high school students using this type of technology to nudify images of their female classmates in Winnipeg and London, Ontario.[33] A 2024 New York Times article described the prevalence of this behaviour as an epidemic confronting teen girls.[34] Today, with the advent of AI image generators, users can create entirely new images of people, rather than having to swap a person’s face or body into existing images.[35]
B. Harms of Non-Consensual Synthetic Intimate Images
Research has shown that there are harms associated with NSII. McGlynn and Rackley, and Flynn et al. identify NSII as a form of image-based sexual abuse (IBSA).[36] They define IBSA as the non-consensual creation, distribution, or threat to distribute nude or sexual images of another person, which includes acts such as voyeurism, sexual extortion, and the non-consensual creation and distribution of intimate images, synthetic or real. One of the most extensive studies on IBSA,[37] by Henry et al., examined the experiences of people who had been subjected to various forms of IBSA, including victims/survivors of NSII. The number of participants in the study who had their images synthetically altered was relatively low compared to other more common forms of IBSA, such as the non-consensual distribution of actual intimate images of a person. However, participants who had experienced either of these forms of abuse reported similar harms, including social rupture, ongoing harms when the images were shared or viewed repeatedly, ongoing fear that the abuse will reoccur, social isolation, and lost freedom, including the ability to trust.[38] The majority of participants found IBSA to be harmful, whereas a smaller percentage reported feeling neutral or even some positive feelings, such as finding the non-consensual use of their images funny or feeling flattered when their images were created or shared.[39]
The research by Henry et al. is an example of the emerging empirical evidence documenting the negative impacts of NSII. Its data has since been examined in scholarship focused on NSII in particular.[40] Additional recent research shows that many people recognize NSII as a social wrong requiring legal intervention. Umbach et al. surveyed over 16,000 people across ten countries about their experiences with deepfakes. The majority of participants viewed the non-consensual creation and distribution of deepfakes as harmful, with many supporting legal intervention.[41] An American study by Kugler and Pace found that participants rated the distribution of non-consensual sexual deepfakes as highly morally blameworthy, even when clearly labeled as fake.[42] Participants from two UK studies by Fido et al. reported that deepfakes were especially harmful when the images were shared publicly rather than used privately, featured women instead of men, or featured people the participants knew personally rather than celebrities.[43] Participants from these studies generally supported a criminal or civil legal response to non-consensual sexual deepfakes.
Despite research recognizing the distribution of NSII as harmful, some non-consensual deepfake creators, consumers, and researchers argue that these images cause few significant harms and that they constitute a form of legitimate sexual fantasy and technological experimentation.[44] Newton and Stanfill found that some deepfake creators fail to see the subjects of their creations as fully human but rather view the images as digital objects to be used in their exploration of the technology.[45] Other deepfake creators, consumers, and researchers claim that these images cause no harm because they are not real representations of their subject. Further, Öhman argues that the non-consensual creation of sexual deepfakes, considered individually, may be deemed morally permissible and somewhat comparable to sexual fantasy, but may be morally impermissible when considered on a grander scale.[46] In an unconventional argument, Ganesh posits that there may be benefits to keeping some forms of sexual deepfakes of children legal, as these images may provide an alternative to child sexual abuse material involving real children and thus reduce the number of actual children being sexually abused.[47] Despite those beliefs, non-consensual sexual deepfakes do cause harm to some of the people featured in them.
Although widespread empirical research is still lagging due to the relatively recent development of this technology, abundant evidence drawn from the lived experience of people targeted by NSII shows the significant emotional, reputational, professional, and financial harms caused.[48] As noted by Bailey and Dunn, NSII is simply one of the newest forms of technology-facilitated gender-based violence that have occurred in digital spaces since the advent of the internet.[49] With each new technological development, abusers seem to find a way to use it to sexually violate others against their will.
When abusers use technology to cause sexual harms, Bailey and Mathen state that the creator “instrumentalizes” the person they target, “using her to achieve his own goals and sublimating her will to his.”[50] An individual’s sexual integrity should be in their control, even in digital contexts. That control is lost when someone makes NSII of them.[51] As argued by Citron, a person’s sexual integrity and privacy is deeply impacted by non-consensual sexual deepfakes that “hijack people’s sexual and intimate identities. ... creating a sexual identity not of the individual’s own making.”[52]
People should have a right to control the sexual boundaries of their digital selves in a similar way to how they control their physical bodies. Unlike personal sexual fantasies, which remain within the mind of the individual, NSII are a real-world manifestation of sexual fantasy that alters the balance of the rights of the people involved. Regardless of an observer’s sexual interest in another person’s body or nude images, the person whose body or image is used to produce NSII should retain control over their sexual experiences, including determining who touches or views intimate images of them. Further, they should have control over whether their images can be used to train and improve generative AI tools used for a sexual purpose. One person’s sexual interest and feelings of sexual entitlement or curiosity should not trump the sexual integrity of another person. With the increasing prevalence of NSII and the documentation of their harms, it is important for lawmakers to consider what types of images should be included in intimate image laws.
II. Legal Responses
While several Canadian laws that address NSII directly are already in force, much room for legislative improvement remains. Notably, there are gaps in some of those laws, as not all civil or criminal protections cover synthetic images, and the definition of “intimate images” is inconsistent in legislation across the country. It will take time, research, and some experimentation to develop a law that properly addresses NSII. However, these efforts are worthwhile. NSII are a form of sexual wrongdoing that should be regulated to protect the sexual integrity, privacy, and digital identities of the individuals they feature. People should have the right to control their sexual expression in digital spaces, including limiting the distribution of realistic sexual images of themselves. Those targeted by NSII should have access to effective legal tools to prevent and remedy the harms associated with the breach of this right.
Of course, comprehensively addressing the harms of NSII will require more than just well-crafted legal responses. As noted in previous Canadian research on technology-facilitated gender-based violence and image-based sexual abuse, a holistic approach that includes legal, technical, social, educational, and community-based efforts is essential to responding to and preventing these types of harms, particularly by shaping social norms about IBSA.[53] This article focuses on some of the legal responses to NSII, while recognizing that these are only a small piece of the puzzle.
This Part provides a brief overview of some of the existing laws in Canada that address intimate images, including some that address NSII, as well as two new bills that propose to regulate NSII in the future.[54] It then examines current and proposed definitions of prohibited images and explores some of the challenges that may arise in applying these definitions to NSII.
A. Existing and Proposed Laws
In Canada, a variety of laws prohibiting the distribution of intimate images of adults and children have been introduced across the country. Some of the definitions of prohibited images are expansive enough to capture NSII, while others are not.[55] The exclusion of NSII from some of these laws is due in part to the fact that the technology used to create NSII had either not been invented or was not widely used at the time of legislative drafting. However, as NSII became more commonplace and concerns about the adequacy of Canadian law to respond to NSII grew, the definitions of prohibited intimate images expanded.[56] For example, certain provinces began including “altered” images in their definitions of intimate images, which could capture forms of NSII such as deepfakes and images made with nudifying apps.[57] Other legislation, such as the recently introduced federal Bill C-63, the Online Harms Act,[58] and Manitoba’s Bill 24, the Intimate Image Protection Amendment Act (Distribution of Fake Intimate Images),[59] includes even broader definitions of prohibited intimate images that could be widely inclusive of many forms of NSII as long as the images are reasonably convincing.
This article examines statutory laws that explicitly address intimate images. These are not the only laws available to people targeted by NSII.[60] For those wishing to pursue a legal remedy for NSII, there are a variety of existing civil, criminal, intellectual property, and human rights laws in Canada that could apply to NSII, allowing for complaints regarding extortion, defamation, privacy, or copyright.[61] Additionally, leading scholars on IBSA, such as Laidlaw,[62] Citron,[63] and Eaton and McGlynn,[64] have called for more general privacy laws that would protect the privacy of targets of IBSA and NSII. However, this article does not examine those laws and limits its analysis to legislation that directly addresses distributing intimate images.
1. Criminal Law: Child Pornography and the Publication of an Intimate Image Without Consent
Canada’s Criminal Code addresses prohibited intimate images under its provisions on child pornography and the publication of intimate images without consent.[65] The child pornography provision prohibits making, distributing, possessing, and accessing “a photographic, film, video or other visual representation, whether or not it was made by electronic or mechanical means” that shows or depicts a person under the age of eighteen engaged in sexual activity, as well as images depicting the sexual organ or anal region of a person under the age of eighteen for a sexual purpose.[66] This definition includes both real and artificial intimate images of children and is inclusive of some forms of NSII.
The non-consensual intimate images provision of the Criminal Code prohibits the distribution of intimate images. It defines an intimate image as “a visual recording of a person made by any means including a photographic, film or video recording” where a person is engaged in sexual activity or is nude, exposing his or her genital organs or anal region or her breasts, where the person would have a reasonable expectation of privacy at the time the recording was made and at the time of distribution.[67] No case has yet tested whether this definition could be interpreted to include NSII, but on a plain reading, it appears to include only authentic intimate images of a person and would therefore not capture NSII.
To date, there are at least two reported criminal cases in Canada where a person was prosecuted under the child pornography provision for making non-consensual sexual deepfakes, as well as other child pornography cases involving simpler technological means, such as Photoshop.[68] In the 2023 case of R v Larouche, the accused created sexual deepfakes of children and included them in his collection of real child pornography. He was subsequently convicted of making, possessing, and distributing child pornography.[69] In the 2024 case of R v Legault, a youth pastor pleaded guilty to making and possessing child pornography.[70] He possessed a collection of child pornography, including at least one image of a teen girl he had “nudified” using the DeepNude app. He had digitally altered an image of a second girl’s face to appear on the body of a nude child. Legault was also in possession of 150 photos of children that the police suspected he planned to input into the DeepNude application. These are some of the first NSII child pornography cases, but they will not be the last. With the increase in deepfakes and other forms of synthetic media being used for child sexual abuse material,[71] the number of successful criminal cases involving child-based NSII will likely increase. However, the criminal law likely does not protect adults targeted by NSII. To date, the author is aware of no reported criminal cases involving NSII and the provision on the non-consensual distribution of intimate images.
2. Civil Law: Intimate Image Statutes
In most Canadian provinces, civil intimate image statutes have been introduced over the last decade to provide a civil remedy when intimate images are shared without consent. Of the common law jurisdictions in Canada, only Ontario and the territories have not introduced specific statutory civil intimate image laws.[72] Several of the provincial statutes in force, including those of Alberta,[73] Manitoba,[74] Newfoundland and Labrador,[75] and Nova Scotia,[76] include a definition of intimate images that is similar to that in the Criminal Code. Like the Criminal Code, these provincial definitions seem to limit their application to actual intimate images of a person. However, there may be some possibility for the courts to interpret those definitions to be inclusive of NSII. Following recommendations made by Laidlaw and Young[77] and the Uniform Law Conference of Canada,[78] several provinces introduced intimate image statutes that defined “intimate images” more broadly to be clearly inclusive of deepfakes. British Columbia,[79] New Brunswick,[80] Prince Edward Island,[81] and Saskatchewan[82] defined intimate images to include images that have been altered to depict a person engaged in sexual activity, nude (or, in some cases such as British Columbia and Prince Edward Island, nearly nude), or exposing their genital organs, anal region, or breasts. The definitions in these provinces further require that the person had a reasonable expectation of privacy at the time of the relevant recording or distribution or, in British Columbia and New Brunswick, at the time of a simultaneous recording or live stream. These definitions clearly include forms of NSII where an image was altered to depict an intimate image, such as deepfakes and nudified images; however, it remains to be seen whether they are broad enough to capture fully generated images, like those made of Swift, where there is no clear original image that was altered. Considering the meaning and purpose behind this new definition of intimate image, it seems reasonable that the courts could and should interpret these types of images as altered intimate images. However, there is a risk that some may not. Generative AI technology requires original images of a person in order to generate novel depictions in their likeness. Therefore, the generated images could arguably be considered altered images under some civil intimate image statutes. To date, no cases involving NSII under any of these acts have been reported.
In March 2024, Manitoba introduced Bill 24, The Intimate Image Protection Amendment Act (Distribution of Fake Intimate Images). It sought to revise the definition of intimate images to include fake intimate images, which would encompass any type of visual recording “that, in a reasonably convincing manner, falsely depicts an identifiable person” as engaging in explicit sexual activity, as being nude, or as exposing their genital organs, anal region, or breasts.[83] The definition further specifies that this includes images created through “the use of software, machine learning, artificial intelligence or other means, including by modifying, manipulating or altering an authentic visual representation” and that “it is reasonable to suspect that the person depicted in the image would not consent to the recording being made or distributed to others.”[84] These false depictions are distinguished from a “personal intimate image” of a person, which is an authentic intimate image where a person would have a reasonable expectation of privacy at the time the image was recorded and distributed.
This bill has since come into force. Manitoba is the first province to update its intimate image statute to include this broader definition of intimate images.[85] Interestingly, there is no reasonable expectation of privacy requirement in the definition of a fake image. Instead, the definition focuses on whether the person would have consented to the distribution or not.
3. Federal Online Harms Bill
In 2024, the federal government introduced Bill C-63, which would enact the Online Harms Act and aims in part to regulate social media companies. It creates a special duty requiring social media platforms to make content that sexually victimizes a child, revictimizes a survivor, or is intimate and communicated without consent inaccessible to all persons in Canada.[86] The bill’s definition of content that sexually victimizes a child or revictimizes a survivor includes a visual representation where a child or someone who is depicted as being a child is engaged in explicit sexual activity, or an image depicting a child’s sexual organs or anal region for a sexual purpose, as well as some other sexualized images involving children.[87] Like the Criminal Code’s child pornography provisions, this definition is broad enough to capture real or synthetic images. It does not include any additional requirements pertaining to a reasonable expectation of privacy or consent to the communication of the images. The bill’s definition of intimate content includes actual images and images that “falsely presents in a reasonably convincing manner” a person engaged in sexual activity, nude, or exposing their sexual organs or anal region.[88] Deepfakes are explicitly named as a type of image that would fit within this definition, which is likely broad enough to capture other forms of NSII, so long as they are “reasonably convincing.” For false images, there must be reasonable suspicion that the subject did not consent to the image being communicated. Both the federal and Manitoba definitions move away from the reasonable expectation of privacy standard for NSII to a standard based on consent to distribution. Neither addresses the non-consensual creation of these images.[89]
B. Defining Intimate Images
To address NSII, governments could consider expanding their definition of intimate images to include synthetic or false but reasonably convincing intimate images, as Manitoba did and as the federal government has proposed. Over the last decade, intimate image laws have shifted in some areas from a definition seemingly limited to authentic intimate images to one that includes altered images. Today, there are proposals to include false images that are reasonably convincing in some intimate image laws. These evolving definitions have aligned with the technological and social circumstances at the time of their legislative introduction. Early laws reacted to actual intimate images released without consent; these were followed by newer legislation that restricted the distribution of altered images due to the rising popularity of deepfakes and nudifying apps. Now, with generative AI, the definition of intimate images in Manitoba and the proposed definition in the Online Harms Act have expanded even further to include intimate images that are reasonably convincing but falsely depict someone nude or engaged in a sexual act. The remainder of this section discusses some of the challenges that will arise as the definition of intimate images develops to include synthetic or fake but reasonably convincing images under both criminal and civil laws on non-consensual intimate image sharing.
The definition of “altered” intimate images in many civil intimate image laws was introduced to address deepfakes. One would hope that this definition will be interpreted to include novel images such as those created using generative AI, but it carries a potential risk: it may be read to require an identifiable original image that was altered before an image is captured under the statute. This definition would likely include face-swapping deepfakes, nudified images, and photoshopped images where the original images are identifiable in the final image. However, it may not capture cases such as those involving generative AI[90] or hyper-realistic avatars created for virtual reality sexual simulations,[91] where an original image may not be easily identifiable or may not exist at all. Further, given that it is increasingly possible to create a realistic digital image of a person without an original photo of them as a base, lawmakers must be alert to these technological advancements when crafting laws addressing the evolving harm presented by AI and hyper-realistic depictions. The potentially underinclusive definition of “altered” intimate images could also pose evidentiary hurdles if the original altered image must be identified or produced within the context of an already complex evidentiary area of law.[92]
If these laws are meant to protect individuals from realistic fake intimate images, either the definition of “altered” must be interpreted broadly enough to capture realistic images of a person whether or not an identifiable original image was altered, or a more encompassing term should be used to capture NSII, such as synthetic images or fake images that are reasonably convincing.
As intimate image laws move towards a definition of reasonably convincing false images, the line between what type of content falls within this definition and what does not is sure to be contested. It is important to note that not all non-consensual sexual representations should be captured by intimate image laws, and that certain defences should be available.[93] For example, a crudely drawn digital image of a person depicted engaging in sexual activity should not fit within these definitions of regulated intimate images. Such a portrayal may be insulting or offensive, but it should not cross the threshold into regulated images. Alternatively, many NSII are realistic but of such poor quality that an unaided observer can identify them as fake. Notably, many hyper-realistic NSII are explicitly labeled to inform the viewer that they are fake, such as those featured on deepfake porn websites that clearly state that the videos are deepfakes, or are understood as fake due to the celebrity status of the person featured in them.[94] Despite this, these images can still cause cognizable harms, including sexual integrity harms, particularly when distributed.[95] The line between images that capture the plaintiff’s likeness and purport to depict reality and those that do not will need to be thoughtfully drawn by lawmakers and the courts. This crucial analysis should focus on upholding the sexual integrity of the image’s subject.
Deciding which altered images and which reasonably convincing false images should fall within legal regulation will be challenging.[96] For example, how should lawmakers categorize very realistic altered images that do not purport to depict reality, such as a high-quality deepfake of a person who is depicted in a sexually graphic scene set in outer space? The context of the image may not depict reality, thus affecting the consideration of whether it is reasonably convincing. However, the images of the subject’s face and body may look perfectly real and harm their sexual integrity if shared. As such, placing a realistic intimate image in an unrealistic context should not be enough to exclude the depiction from the definition of intimate image. Alternatively, where an image is in a realistic setting but is of lower quality, the threshold of what is reasonably convincing will also be contested. Such could be the case for a non-consensual sexual deepfake that is a bit glitchy or an AI-generated image that has the overly shiny quality typical of such images. These flaws may signal to the viewer that the image is digitally created and not real, but small technical imperfections should not be enough to exclude an image from this definition so long as the person is identifiable and the image has a clear realism to it. Fundamentally, the threshold for legal regulation should not be so high as to exclude the majority of realistic NSII that are still clearly digitally created when examined closely, nor so low as to include any artistic rendering of a nude person, such as a crude line drawing or unrealistic cartoon.
The expansion of civil laws on the non-consensual distribution of intimate images and of federal social media content moderation laws to include NSII is less controversial than introducing criminal provisions. Many agree that laws supporting swift orders for the takedown and deletion of non-consensually distributed intimate images should be introduced.[97] Conversely, the criminalization of these images is highly debated.[98] As with all criminal laws, there is a need for greater scrutiny when considering criminalizing NSII due to the significant risks to liberty and potential human rights challenges related to freedom of expression that could arise. Additionally, many argue that a carceral approach has been ineffective in addressing sexual harms in Canada[99] and should not be used for what some argue are expressive behaviours.[100] Others support a criminal approach as a remedy for targets of this type of abuse.[101] Although the criminal justice system presents significant systemic barriers to assisting victims of IBSA[102] and should not be the primary source of legal intervention,[103] scholars such as Mathen argue that criminal law plays an important expressive function in recognizing sexual wrongs that engage in blameworthy sexual objectification leading to individual and systemic harms, such as forms of IBSA.[104] Since their introduction in Canada, criminal law provisions have proven to be an effective tool for some people targeted by specific forms of IBSA, such as victims of voyeurism and the non-consensual distribution of intimate images.[105] If the criminal law definition of intimate images is expanded to include NSII, deeper scrutiny will be required of which behaviours would be captured by the criminal provision and what punishments would be appropriate.
Conclusion
This article reviews one possible option for addressing NSII: broadening the language of criminal and civil intimate image laws to include NSII in legal regulation. It does not suggest that this is the only or even the best solution to address NSII. However, with the proliferation of sexualized generative AI, it is one worth considering. Assessing the effectiveness of the legal and non-legal approaches used to fully address these harms in response to technological development will take time. While solutions addressing NSII are still developing, it is undeniable that NSII cause real harms to many of the individuals featured in them and that legal action is needed. Legal intimate image protections should cover both real and synthetic images, ensuring that people have effective rights and supports to protect their sexual integrity, digital or otherwise.
Notes
[1]
Samantha Cole & Emanuel Maiberg, “The Taylor Swift Deepfakes Disaster Threatens to Change the Internet as We Know It”, 404 Media (31 January 2024), online: <404media.co> [Cole & Maiberg, “Taylor Swift Deepfakes”] [perma.cc/V3SX-NN6K].
[2]
Suzie Dunn & Kristen Thomasen, “Taylor Swift May Speak Now About Sexual Deepfake Images. But That’s Not Enough”, The Globe and Mail (1 February 2024), online: <theglobeandmail.com> [perma.cc/65LP-VKTS].
[3]
Matt O’Brien & Haleluya Hadero, “AI Image-Generators Are Being Trained on Explicit Photos of Children, a Study Shows”, Associated Press (20 December 2023), online: <apnews.com> [perma.cc/NB38-6MHM].
[4]
4chan is a controversial imageboard website known for hosting problematic content, including harassing content and image-based abuse. Emanuel Maiberg & Samantha Cole, “AI-Generated Taylor Swift Porn Went Viral on Twitter. Here’s How It Got There”, 404 Media (25 January 2024), online: <404media.co> [perma.cc/SE5M-XGN9]; Tiffany Hsu, “Fake and Explicit Images of Taylor Swift Started on 4chan, Study Says”, The New York Times (5 February 2024), online: <nytimes.com> [perma.cc/5ZMX-LVYU].
[5]
“White House ‘Alarmed’ by AI Deepfakes of Taylor Swift”, (26 January 2024), online (video): <washingtonpost.com> [perma.cc/A6G9-B8F6]; Saba Eitizaz, “Taylor Swift and the Dystopian World of AI Deepfakes”, Toronto Star (15 February 2024), online (podcast): <thestar.com> [perma.cc/M4ZD-KFJK].
[6]
Samantha Cole, “AI-Assisted Fake Porn Is Here and We’re All Fucked”, Vice (11 December 2017), online: <vice.com> [perma.cc/EU6F-D63V] [Cole, “AI-Assisted Fake Porn”].
[7]
Darren Bernhardt, “AI-Generated Fake Nude Photos of Girls from Winnipeg School Posted Online”, CBC News (15 December 2023), online: <cbc.ca> [perma.cc/D9S6-P86E].
[8]
See e.g. Bobby Chesney & Danielle Citron, “Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security” (2019) 107:6 Cal L Rev 1753; Britt Paris & Joan Donovan, Deepfakes and Cheap Fakes: The Manipulation of Audio Visual Evidence (Data & Society, 2019) at 8; Sarah Alex Howes, “Digital Replicas, Performers’ Livelihoods, and Sex Scenes: Likeness Rights for the 21st Century” (2019) 42:3 Colum J L & Arts 345; Asher Flynn, Jonathan Clough & Talani Cooke, “Disrupting and Preventing Deepfake Abuse: Exploring Criminal Law Responses to AI-Facilitated Abuse” in Anastasia Powell, Asher Flynn & Lisa Sugiura, eds, The Palgrave Handbook of Gendered Violence and Technology (Cham, Switzerland: Palgrave Macmillan, 2021) 583 at 596.
[9]
NSII is defined in this article: Rebecca Umbach et al, “Non-Consensual Synthetic Intimate Imagery: Prevalence, Attitudes, and Knowledge in 10 Countries” in Florian Floyd Mueller et al, eds, CHI ‘24: Proceedings of the CHI Conference on Human Factors in Computing Systems (New York: Association for Computing Machinery, 2024).
[10]
Suzie Dunn, “Identity Manipulation: Responding to Advances in Artificial Intelligence and Robotics” (Paper prepared for the We Robot 2020 conference, Ottawa, 2–4 April 2020) [unpublished], online (pdf): <digitalcommons.schulichlaw.dal.ca> [perma.cc/W9RE-A834] [Dunn, “Identity Manipulation”].
[11]
Danielle Keats Citron, “Sexual Privacy” (2019) 128:7 Yale LJ 1870 at 1898–99 [Citron, “Sexual Privacy”].
[12]
Umbach et al, supra note 9 at 1.
[13]
Henry Ajder & Joshua Glick, Just Joking! Deepfakes, Satire, and the Politics of Synthetic Media (Witness Media Lab, Co-Creation Studio & MIT Open Documentary Lab).
[14]
Dunn, “Identity Manipulation”, supra note 10; Rumman Chowdhury, “Your Opinion Doesn’t Matter, Anyways”: Exposing Technology-Facilitated Gender-Based Violence in an Era of Generative AI (Paris: UNESCO, 2023) at 10–11.
[15]
Clare McGlynn, Erika Rackley & Ruth Houghton, “Beyond ‘Revenge Porn’: The Continuum of Image-Based Sexual Abuse” (2017) 25:1 Fem Leg Stud 25 at 33–34.
[16]
Summary of Discussions and Next Step Recommendations from “Mal-uses of AI-generated Synthetic Media and Deepfakes: Pragmatic Solutions Discovery Convening” (Brooklyn: Witness, 2018).
[17]
Citron, “Sexual Privacy”, supra note 11 at 1921.
[18]
Jacquelyn Burkell & Chandell Gosse, “Nothing New Here: Emphasizing the Social and Cultural Context of Deepfakes” (2019) 24:12 First Monday; Hany Farid, “Creating, Using, Misusing, and Detecting Deep Fakes” (2022) 1:4 J Online Trust & Safety 1.
[19]
Chidera Okolie, “Artificial Intelligence-Altered Videos (Deepfakes): Image-Based Sexual Abuse, and Data Privacy Concerns” (2023) 25:2 J Intl Women’s Studies 1 at 7–8.
[20]
Cole, “AI-Assisted Fake Porn”, supra note 6; Lux Alptraum, “Deepfake Porn Harms Adult Performers, Too”, Wired (15 January 2020), online: <wired.com> [perma.cc/S4TJ-F2SL]; Nicola Henry et al, Image-Based Sexual Abuse: A Study on the Causes and Consequences of Non-Consensual Nude or Sexual Imagery (London: Routledge, 2020) at 96; Samantha Cole, How Sex Changed the Internet and the Internet Changed Sex: An Unexpected History (New York: Workman Publishing, 2022) [Cole, “How Sex Changed the Internet”].
[21]
Matt Burgess, “Deepfake Porn Is Out of Control”, Wired (16 October 2023), online: <wired.com> [perma.cc/6D8B-Y5A7].
[22]
Henry Ajder et al, The State of Deepfakes: Landscape, Threats and Impact (Deeptrace, 2019) at 2.
[23]
Umbach et al, supra note 9 at 9–10; Asher Flynn et al, “Deepfakes and Digitally Altered Imagery Abuse: A Cross-Country Exploration of an Emerging form of Image-Based Sexual Abuse” (2022) 62:6 Brit J Crim 1341 at 1350, 1355 [Flynn et al, “Deepfakes”].
[24]
Matthew Hall, Jeff Hearn & Ruth Lewis, “Image-Based Sexual Abuse: Online Gender-Sexual Violations” (2023) 3:1 Encyclopedia 327 at 334.
[25]
Cole & Maiberg, “Taylor Swift Deepfakes”, supra note 1.
[26]
McGlynn, Rackley & Houghton, supra note 15 at 33; Paris & Donovan, supra note 8 at 25.
[27]
Sophie Maddocks, “‘A Deepfake Porn Plot Intended to Silence Me’: Exploring Continuities Between Pornographic and ‘Political’ Deep Fakes” (2020) 7:4 Porn Studies 415 at 417, 419.
[28]
Home Security Heroes, “2023 State of Deepfakes: Realities, Threats, and Impact” (last visited 31 August 2024), online: <homesecurityheroes.com> [perma.cc/VQ56-QD9B].
[29]
Burgess, supra note 21.
[30]
Clare McGlynn, “Deepfake Porn: Why We Need to Make It a Crime to Create It, Not Just Share It”, The Conversation (9 April 2024), online: <theconversation.com> [perma.cc/VJ93-2BMT].
[31]
Henry Ajder, Giorgio Patrini & Francesco Cavalli, “Automating Image Abuse: Deepfake Bots on Telegram” (October 2020) at 2, online (pdf): <stareintothelightsmypretties.jore.cc> [perma.cc/4AY5-CB5X].
[32]
Santiago Lakatos, “A Revealing Picture: AI-Generated ‘Undressing’ Images Move from Niche Pornography Discussion Forums to a Scaled and Monetized Online Business” (December 2023), online: <graphika.com> [perma.cc/DJQ9-H8ER].
[33]
Bernhardt, supra note 7; Jessica Wong, “Amid Rise in AI Deepfakes, Experts Urge School Curriculum Updates for Online Behaviour”, CBC News (9 January 2024), online: <cbc.ca> [perma.cc/DY75-L4CM].
[34]
Natasha Singer, “Teen Girls Confront an Epidemic of Deepfake Nudes in Schools”, The New York Times (8 April 2024), online: <nytimes.com> [perma.cc/2UAF-3VPL].
[35]
Chowdhury, supra note 14 at 10.
[36]
Clare McGlynn & Erika Rackley, “Image-Based Sexual Abuse” (2017) 37:3 Oxford J Leg Stud 534 at 534; Flynn et al, “Deepfakes”, supra note 23 at 1342.
[37]
Henry et al, supra note 20.
[38]
Ibid at 53; see also Umbach et al, supra note 9; Clare McGlynn et al, “‘It’s Torture for the Soul’: The Harms of Image-Based Sexual Abuse” (2021) 30:4 Soc & Leg Stud 541 at 543.
[39]
Henry et al, supra note 20 at 40, 46, 114.
[40]
Flynn et al, “Deepfakes”, supra note 23 at 1351–52.
[41]
Umbach et al, supra note 9 at 12, 15.
[42]
Matthew B Kugler & Carly Pace, “Deepfake Privacy: Attitudes and Regulation” (2021) 116:3 Nw UL Rev 611 at 639.
[43]
Dean Fido, Jaya Rao & Craig A Harper, “Celebrity Status, Sex, and Variation in Psychopathy Predicts Judgements of and Proclivity to Generate and Distribute Deepfake Pornography” (2022) 129 Computers in Human Behavior 1 at 9, 11.
[44]
Daniel Story & Ryan Jenkins, “Deepfake Pornography and the Ethics of Non‑Veridical Representations” (2023) 36:56 Philosophy & Tech 55; Lara Karaian, “Indicting Deepfakes?: Why Gender-Based Violence Frameworks and Censorship Aren’t the Only (or Best) Way to Respond to AI Generated Pornography” (6 March 2023), online (blog): <cfe.torontomu.ca> [perma.cc/4W32-G4HE]; Lara Karaian, “Addressing Deepfake Porn Doesn’t Require New Criminal Laws, Which Can Restrict Sexual Fantasy and Promote the Prison System”, The Conversation (24 March 2024), online: <theconversation.com> [perma.cc/AN3A-848V] [Karaian, “Addressing Deepfakes”].
[45]
Olivia B Newton & Mel Stanfill, “My NSFW Video Has Partial Occlusion: Deepfakes and the Technological Production of Non-Consensual Pornography” (2020) 7:4 Porn Studies 398.
[46]
Carl Öhman, “Introducing the Pervert’s Dilemma: A Contribution to the Critique of Deepfake Pornography” (2020) 22:2 Ethics & Information Tech 133 at 134–35, 138.
[47]
Harshita Ganesh, “Protecting Children Through Deepfake Child Pornography: A Moral, Legal, and Philosophical Discussion on the Intersection of the Evolution in Law and Technology” (2022) 60 Am Crim L Rev 1.
[48]
Nicole Krättli, “Fake Porn – Real Victims: How Women Become Targets of Artificial Intelligence” (1 September 2023), online (video): <nzz.ch> [perma.cc/4H24-WF9H]; Danielle Keats Citron, The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age (New York: WW Norton & Company, 2022); Sophie Compton, Reuben Hamlyn & Isabel Freeman, “Another Body” (2023), online (video): <ok.ru> [perma.cc/U2Z8-764G]; BBC, “Deepfake Porn: Could You Be Next?” (21 October 2022), online (video): <bbc.co.uk> [perma.cc/63SM-XPP2]; Asia A Eaton & Clare McGlynn, “The Psychology of Nonconsensual Porn: Understanding and Addressing a Growing Form of Sexual Violence” (2020) 7:2 Policy Insights from the Behavioral & Brain Sciences 190 at 192.
[49]
Jane Bailey & Suzie Dunn, “The More Things Change, The More They Stay the Same: Recurring Themes in Tech-facilitated Sexual Violence Over Time” in Gian Marco Caletti & Kolis Summerer, eds, Criminalizing Intimate Image Abuse: A Comparative Perspective (Oxford: Oxford University Press, 2024) 40 at 41.
[50]
Jane Bailey & Carissima Mathen, “Technologically-Facilitated Violence Against Women and Girls: If Criminal Law Can Respond, Should It?” (2017) University of Ottawa Faculty of Law, Working Paper No 2017-44 at 22 [emphasis omitted].
[51]
Mary Anne Franks & Ari Ezra Waldman, “Sex, Lies, and Videotape: Deep Fakes and Free Speech Delusions” (2019) 78:4 Md L Rev 892 at 893.
[52]
Citron, “Sexual Privacy”, supra note 11 at 1921.
[53]
Cynthia Khoo, Deplatforming Misogyny: Report on Platform Liability for Technology-Facilitated Gender-Based Violence (Toronto: Women’s Legal Education and Action Fund, 2021); Suzie Dunn, Tracy Vaillancourt & Heather Brittain, Supporting Safer Digital Spaces (Waterloo: Centre for International Governance Innovation, 2023).
[54]
Note that at the time this paper was written, the Manitoba legislature had proposed changes to The Non-Consensual Distribution of Intimate Images Act, CCSM c N93, which have since come into force. See e.g. s 1(1) “fake intimate image”.
[55]
For an example of a Canadian statute that does not include altered or fake images, see Criminal Code, RSC 1985, c C-46, s 162.1; for an example of a Canadian statute that does include altered or fake images, see Intimate Images Unlawful Distribution Act, SNB 2022, c 1, s 1.
[56]
Michelle Rempel Garner, “Your Ex Used AI to Create Intimate Images of You, and Sent Them to Your Friends. It Might Not Be Illegal.” (7 November 2023), online (blog): <michellerempelgarner.substack.com> [perma.cc/VL25-K83R]; “Manitoba Introduces Bills To Protect Against AI-Generated Nudes, Prevent Certain Offenders From Changing Names”, CBC News (14 March 2024), online: <cbc.ca> [perma.cc/ZUR9-L9AB].
[57]
The Privacy Act, RSS 1978, c P-24, s 7.1.
[58]
Bill C-63, An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts, 1st Sess, 44th Parl, 2024, cl 2(1)(a) (first reading 26 February 2024) [Bill C-63].
[59]
Bill 24, The Intimate Image Protection Amendment Act (Distribution of Fake Intimate Images), 1st Sess, 43rd Leg, Manitoba, 2024, cl 3(1) [Bill 24]. Note: this bill has since become law.
[60]
BJ Siekierski, Deep Fakes: What Can Be Done About Synthetic Audio and Video? (Ottawa: Library of Parliament, 2019).
[61]
Vasileia Karasavva & Aalia Noorbhai, “The Real Threat of Deepfake Pornography: A Review of Canadian Policy” (2021) 24:3 Cyberpsychology, Behavior, & Soc Networking 203 at 205–06; Suzie Dunn & Alessia Petricone-Westwood, “More than ‘Revenge Porn’ Civil Remedies for the Nonconsensual Distribution of Intimate Images” (Paper delivered at the 38th Annual Civil Litigation Conference, Mont Tremblant, 16–17 November 2018), 2018 CanLIIDocs 10789; Meghan Sali, “Intimate Images and Authors’ Rights: Non-Consensual Disclosure and the Copyright Disconnect” (2022) 19:2 CJLT 343; Anne Pechenik Gieseke, “‘The New Weapon of Choice’: Law’s Current Inability to Properly Address Deepfake Pornography” (2020) 73:5 Vand L Rev 1479.
[62]
Emily Laidlaw, “Technology Mindfulness and the Future of the Tort of Privacy” (2023) 60:3 Osgoode Hall LJ 597 at 647–49.
[63]
Citron, “Sexual Privacy”, supra note 11.
[64]
Eaton & McGlynn, supra note 48 at 195.
[65]
This exact term, “intimate image,” is not used in the child pornography provision, but it does address sexual/sexualized images of children.
[66]
Criminal Code, supra note 55, s 163.1(1)(a).
[67]
Criminal Code, supra note 55, s 162.1(2).
[68]
R v Rhode, 2019 SKCA 17; R v CH, 2010 ONCJ 270; R v RMV, 2015 BCPC 469; R v GJM, 2015 MBCA 103; R v RK, 2015 ONSC 2391.
[69]
R v Larouche, 2023 QCCQ 1853.
[70]
R v Legault, 2024 BCPC 29.
[71]
Joseph Cox et al, “a16z Funded AI Platform Generated Images That ‘Could Be Categorized as Child Pornography,’ Leaked Documents Show” (5 December 2023) at 01m43s–02m49s, online (podcast): <podcasts.apple.com> [perma.cc/ABN7-WGD6].
[72]
Note that several common law torts have been recognized in Ontario that address some forms of intimate image sharing. Quebec recently introduced Bill 73, Loi visant à contrer le partage sans consentement d’images intimes et à améliorer la protection et le soutien en matière civile des personnes victimes de violence, 1st Sess, 43rd Leg, Quebec, 2024.
[73]
Protecting Victims of Non-consensual Distribution of Intimate Images Act, SA 2017, c P-26.9, s 1(b).
[74]
The Intimate Image Protection Act, CCSM c I87, s 1(1). Note that this statute has been updated since the drafting of this paper and now includes a definition of intimate images that includes fake images, which would include many forms of NSII.
[75]
Intimate Images Protection Act, SNL 2018, c I-22, s 2.
[76]
Intimate Images and Cyber-protection Act, SNS 2017, c 7, s 3(f).
[77]
Emily Laidlaw & Hilary Young, “Creating a Revenge Porn Tort for Canada” (2020) 96 SCLR 147 at 158.
[78]
Uniform Law Conference of Canada, “Uniform Non-consensual Disclosure of Intimate Images Act (2021)” (1 January 2021), s 1 “Altered Images”, online (pdf): <cdn-res.keymedia.com> [perma.cc/HV86-GADG].
[79]
Intimate Images Protection Act, SBC 2023, c 11, s 1.
[80]
Intimate Images Unlawful Distribution Act, SNB 2022, c 1, s 1.
[81]
Intimate Images Protection Act, RSPEI 1988, c I-9.1, s 1(f).
[82]
The Privacy Act, supra note 55, s 7.1.
[83]
Bill 24, supra note 59, cl 3(1). Note that since the time of drafting, this bill has come into force: see supra note 74, s 1(1).
[84]
Ibid.
[85]
Ibid.
[86]
Bill C-63, supra note 58, cl 67.
[87]
Ibid, cl 2(1).
[88]
Ibid, cl 2(1).
[89]
Some jurisdictions regulate the creation of NSII: see Clare McGlynn, “Deepfake Porn: Why We Need to Make It a Crime to Create It, Not Just Share It”, The Conversation (9 April 2024), online: <theconversation.com> [perma.cc/H3M7-W47T].
[90]
Cox et al, supra note 71.
[91]
Dunn, “Identity Manipulation”, supra note 10 at 21.
[92]
Suzie Dunn & Moira Aikenhead, “On the Internet No One Knows You Are a Dog: Contested Authorship of Digital Evidence in Cases of Gender-Based Violence” (2022) 19:2 CJLT 371.
[93]
Laidlaw & Young, supra note 77 at 167.
[94]
McGlynn, supra note 30; see also Keith Raymond Harris, “Video on Demand: What Deepfakes Do and How they Harm” (2021) 199:5–6 Synthese 13373 at 13386.
[95]
Cole, “How Sex Changed the Internet”, supra note 20; Miha Šepec & Melanija Lango, “Virtual Revenge Pornography as a New Online Threat to Sexual Integrity” (2020) 15 Balkan Soc Science Rev 117 at 118–19.
[96]
For an example of the variety of synthetic sexual fanfiction media, see Milena Popova, “Reading Out of Context: Pornographic Deepfakes, Celebrity and Intimacy” (2020) 7:4 Porn Studies 367 at 372–75.
[97]
See e.g. Khoo, supra note 53 at 126.
[98]
Karaian, “Addressing Deepfakes”, supra note 44.
[99]
Kristen Thomasen & Suzie Dunn, “Reasonable Expectations of Privacy in the Era of Drones and Deepfakes: Examining the Supreme Court of Canada’s Decision in R v Jarvis” in Jane Bailey, Asher Flynn & Nicola Henry, eds, The Emerald International Handbook of Technology-Facilitated Violence and Abuse (Bingley, UK: Emerald Publishing, 2021) 555 at 558; Statistics Canada, A Comprehensive Portrait of Police-Reported Crime in Canada, 2021, Catalogue No 11-001-X (Ottawa: Statistics Canada, 2022), online: <www150.statcan.gc.ca> [perma.cc/7ZU5-V3W5]; Statistics Canada, Police-Reported Cybercrime, By Cyber-Related Violation, Canada (Selected Police Services), Table no 35-10-0001-01 (Ottawa: Statistics Canada, 2023), online: <www150.statcan.gc.ca> [perma.cc/TG38-TMA8].
[100]
Karaian, “Addressing Deepfakes”, supra note 44.
[101]
Breanna Sheppard, “Deep Fake Pornography and Section 162.1” (11 March 2022), online (blog): <robsoncrim.com> [perma.cc/9TPB-8533]; Flynn, Clough & Cooke, supra note 8 at 596.
[102]
Dunn & Aikenhead, supra note 92 at 372–73.
[103]
Dunn, Vaillancourt & Brittain, supra note 53 at 70.
[104]
Carissima Mathen, “Crowdsourcing Sexual Objectification” (2014) 3:3 Laws 529 at 530.
[105]
Bailey & Mathen, supra note 50 at 4–6.