
Introduction

In January 2024, X (formerly Twitter) was flooded with sexual images of Taylor Swift.[1] These images were created using artificial intelligence (AI) image generators without Swift's consent.[2] Some of the most well-known AI image generators, including the one used to create the images of Swift, have attempted to curb, or prevent altogether, the generation of celebrity and sexual images through their products.[3] However, a group of 4chan users challenged each other to find a way to bypass these prompt guardrails and were eventually able to generate the intimate images of Swift that were later widely shared on X.[4] The images were viewed millions of times, and the non-consensual creation and distribution of these images was widely condemned in the media and by some governments.[5] Although this story brought newfound energy to the debate about regulating non-consensual synthetic intimate images (NSII; i.e., sexual images of a person created using technology such as AI or Photoshop without their consent), celebrities like Swift have had sexual deepfakes made of them for years without their consent and with little legal recourse.[6] As the technology became more accessible and diverse, the breadth of who was targeted also expanded. Today, a wide swath of people are targeted by NSII, ranging from celebrities to schoolgirls.[7]

Nationally and internationally, there have been various calls to regulate synthetic media.[8] This article does not address the full scope of potential harms that synthetic media in general may cause; its analysis will be limited to various forms of NSII.[9] While it is important that lawmakers address the broader violations associated with synthetic media,[10] NSII requires a different legal analysis compared to other forms of synthetic imagery, because these images contain sexual content that involves the uniquely sensitive issues of sexual integrity and privacy.[11] This article examines some of the existing and proposed legal solutions related to NSII in Canada and discusses the evolving definition of intimate images.

Part I provides an overview of NSII. It describes the meaning of this term, how it fits under the larger umbrella of image-based sexual abuse (IBSA), and the harms caused by NSII that call for a legal remedy. Part II provides a brief overview of existing and proposed laws in Canada that address NSII. It then examines how the definition of "intimate images" in Canada's criminal and civil intimate image laws could be made inclusive of NSII and discusses some challenges that may arise with various existing and proposed definitions.

I. Overview of Non-Consensual Synthetic Intimate Images and their Harms

A. What Are Non-Consensual Synthetic Intimate Images?

Non-consensual synthetic intimate images (NSII) are intimate images of a person that are created using technology such as AI or Photoshop without the consent of the person featured in them.[12] Synthetic media refers to any form of media that has been digitally manipulated or created to represent something that does not exist in reality, often with the use of AI.[13] This type of media is created using face-swapping technology, such as replacing a person’s face in an existing pornography video or superimposing their face on a live sex video; image manipulation, which adds or changes the information in an image, such as by making a clothed person appear nude; or generative AI, which is used to create entirely new images where a person appears nude or engaged in sexual activity.[14] This technology can be used for creative and positive sexual purposes. However, when used without the consent of the person represented in the image, it can cause harm worthy of legal intervention. NSII includes images where a person is depicted as nude or semi-nude, exposing their genitals, anal region, or breasts, or engaged in sexually explicit activity. Crucially, NSII are made or distributed without that person’s consent. They can be a form of image-based sexual abuse.[15]

Deepfake videos are one of the most well-known forms of synthetic media.[16] Deepfakes are videos or images in which AI is used to alter the depicted content.[17] Deepfakes originated as a form of AI that could swap faces in videos, but the term "deepfake" has come to describe many forms of synthetic media that misrepresent something or someone.[18] Typically, a non-consensual sexual deepfake is a video in which a person's face is superimposed on a pre-existing pornographic video, resulting in fairly realistic footage that appears to show the person engaging in sex acts they did not actually perform.[19] As discovered by journalist Samantha Cole, deepfake technology was popularized in 2017 after a Reddit user posted sexual deepfakes he had made of several famous female celebrities on a Reddit board.[20] By 2023, hundreds of thousands of deepfake videos existed online.[21] Of the publicly available sexual deepfakes, the vast majority are made without the consent of the person in the image, and they almost exclusively feature women.[22] Recent studies by Umbach et al. and Flynn et al. also show noteworthy self-reported rates of victimization among men,[23] although images featuring men seem less likely to appear publicly online. These studies also show that men are more likely to create and consume sexual deepfakes.

Other forms of AI and digital technologies have been used to create NSII, such as “nudifying” apps that transform still images of fully clothed women or girls into photos where they appear to be fully nude,[24] generative AI used to create entirely new sexual images,[25] or older technology, like photo editing software, that can merge or edit photos of a person to make it falsely appear that they are naked or engaging in sex acts.[26]

Despite many social media and pornography websites banning NSII,[27] the popularity of these images has not waned.[28] Wired reported that an independent researcher found that nearly 250,000 sexual deepfakes were uploaded onto thirty-five of the most popular deepfake pornography websites over the preceding seven years, with more than 100,000 uploaded in 2023 alone.[29] In 2024, McGlynn reported that one of the most popular deepfake websites was accessed around 17 million times per month.[30] In 2020, Sensity reported that over 100,000 images of women had been artificially stripped nude and shared publicly on Telegram; 70% of these images were taken from social media profiles or were private images of private individuals.[31] These images are sometimes called "deepnudes." In 2023, Graphika identified over 24 million unique views on thirty-four NSII providers' websites, showing the growing interest in the technology.[32] As such, the use of this type of technology is becoming more widespread. Further, between 2023 and 2024, news reports emerged of male high school students in Winnipeg and London, Ontario, using this type of technology to nudify images of their female classmates.[33] A 2024 New York Times article described the prevalence of this behaviour as an epidemic confronting teen girls.[34] Today, with the advent of AI image generators, users can create entirely new images of people, rather than having to swap a person's face or body into an existing image.[35]

B. Harms of Non-Consensual Synthetic Intimate Images

Research has shown that there are harms associated with NSII. McGlynn and Rackley, and Flynn et al., identify NSII as a form of image-based sexual abuse (IBSA).[36] They define IBSA as the non-consensual creation, distribution, or threat to distribute nude or sexual images of another person, which includes acts such as voyeurism, sexual extortion, and the non-consensual creation and distribution of intimate images, synthetic or real. One of the most extensive studies on IBSA,[37] by Henry et al., examined people who had experienced various forms of IBSA, including those who were victims/survivors of NSII. The number of participants in the study who had their images synthetically altered was relatively low compared to other, more common forms of IBSA, such as the non-consensual distribution of actual intimate images of a person. However, participants who had experienced either of these forms of abuse reported similar harms, including social rupture, ongoing harms when the images were shared or viewed repeatedly, ongoing fear that the abuse would recur, social isolation, and lost freedom, including the ability to trust.[38] The majority of participants found IBSA to be harmful, whereas a smaller percentage reported feeling neutral or even some positive feelings, such as finding the non-consensual use of their images funny or feeling flattered when their images were created or shared.[39]

The research by Henry et al. is an example of the emerging empirical evidence documenting the negative impacts of NSII. Its data has since been examined in scholarship focusing on NSII in particular.[40] Additional recent research shows that many people recognize NSII as a social wrong requiring legal intervention. Umbach et al. surveyed over 16,000 people across ten countries about their experiences with deepfakes. The results demonstrated that the majority of participants viewed the non-consensual creation and distribution of deepfakes as harmful, with many supporting legal intervention.[41] An American study by Kugler and Pace noted that their participants rated the distribution of non-consensual sexual deepfakes as highly morally blameworthy, even when clearly labeled as fake.[42] Participants from two UK studies by Fido et al. reported that deepfakes were especially harmful when the images were shared publicly rather than used privately, featured women instead of men, or featured people that participants knew personally rather than celebrities.[43] Participants from these studies generally supported a criminal or civil legal response to non-consensual sexual deepfakes.

Despite research recognizing behaviour tied to the distribution of NSII as harmful, some non-consensual deepfake creators, consumers, and researchers argue that these images cause few significant harms and that they constitute a form of legitimate sexual fantasy and technological experimentation.[44] Newton and Stanfill found that some deepfake creators fail to see the subjects of their creations as fully human, viewing the images instead as digital objects to be used in their exploration of the technology.[45] Other deepfake creators, consumers, and researchers claim that these images cause no harm because they are not real representations of their subject. Further, Öhman argues that the non-consensual creation of sexual deepfakes can be deemed morally permissible when considered individually, being somewhat comparable to sexual fantasies, but may be morally impermissible when considered on a grander scale.[46] In an unconventional argument, Ganesh posits that there may be benefits to keeping some forms of sexual deepfakes of children legal, as these images may provide an alternative to child sexual abuse material involving real children and thus reduce the number of actual children being sexually abused.[47] Despite those beliefs, non-consensual sexual deepfakes do cause harm to some of the people featured in them.

Although widespread empirical research is still lagging due to the relatively recent development of this technology, abundant evidence drawn from the lived experience of people targeted by NSII shows the significant emotional, reputational, professional, and financial harms caused.[48] As noted by Bailey and Dunn, NSII is simply one of the newest forms of technology-facilitated gender-based violence that have occurred in digital spaces since the advent of the internet.[49] With each new technological development, abusers seem to find a way to use it to sexually violate others against their will.

When abusers use technology to cause sexual harms, Bailey and Mathen state that the creator "instrumentalizes" the person they target, "using her to achieve his own goals and sublimating her will to his."[50] An individual's sexual integrity should be in their control, even in digital contexts. That control is lost when someone makes NSII of them.[51] As argued by Citron, a person's sexual integrity and privacy are deeply impacted by non-consensual sexual deepfakes that "hijack people's sexual and intimate identities ... creating a sexual identity not of the individual's own making."[52]

People should have a right to control the sexual boundaries of their digital selves, just as they do with their physical bodies. Unlike personal sexual fantasies, which remain within the mind of the individual, NSII are a real-world manifestation of sexual fantasy that alters the balance of the rights of the people involved. Regardless of an observer's sexual interest in another person's body or nude images, the person whose body or image is used to produce NSII should retain control over their sexual experiences, including determining who touches or views intimate images of them. Further, they should have control over whether their images can be used to train and improve generative AI tools used for a sexual purpose. One person's sexual interest and feelings of sexual entitlement or curiosity should not trump the sexual integrity of another person. With the increasing prevalence of NSII and the documentation of their harms, it is important for lawmakers to consider what types of images should be included in intimate image laws.

II. Legal Responses

While several Canadian laws that address NSII directly are already in force, much room for legislative improvement remains. Notably, there are gaps in some of those laws, as not all civil or criminal protections cover synthetic images, and the definition of “intimate images” is inconsistent in legislation across the country. It will take time, research, and some experimentation to develop a law that properly addresses NSII. However, these efforts are worthwhile. NSII are a form of sexual wrongdoing that should be regulated to protect the sexual integrity, privacy, and digital identities of the individuals they feature. People should have the right to control their sexual expression in digital spaces, including limiting the distribution of realistic sexual images of themselves. Those targeted by NSII should have access to effective legal tools to prevent and remedy the harms associated with the breach of this right.

Of course, comprehensively addressing the harms of NSII will require more than just well-crafted legal responses. As noted in previous Canadian research on technology-facilitated gender-based violence and image-based sexual abuse, a holistic approach that includes legal, technical, social, educational, and community-based efforts is essential to responding to and preventing these types of harms, particularly by shaping social norms about IBSA.[53] This article focuses on some of the legal responses to NSII, while recognizing that these are only a small piece of the puzzle.

This Part provides a brief overview of some of the existing laws in Canada that address intimate images, including some that address NSII, as well as two new bills that propose to regulate NSII in the future.[54] It then examines current and proposed definitions of prohibited images and explores some of the challenges that may arise in applying these definitions to NSII.

A. Existing and Proposed Laws

In Canada, a variety of laws prohibiting the distribution of intimate images of adults and children have been introduced across the country. Some of the definitions of prohibited images are expansive enough to capture NSII, while others are not.[55] The exclusion of NSII from some of these laws stems in part from the fact that, at the time of legislative drafting, the technology used to create NSII either had not been invented or was not widely used for that purpose. However, as NSII became more commonplace and concerns grew about the adequacy of Canadian law to respond to it, definitions of prohibited intimate images expanded.[56] For example, certain provinces began including "altered" images in their definitions of intimate images, which could capture forms of NSII such as deepfakes and nudified images.[57] Other legislation, such as the recently introduced federal Bill C-63, the Online Harms Act,[58] and Manitoba's Bill 24, the Intimate Image Protection Amendment Act (Distribution of Fake Intimate Images),[59] includes even broader definitions of prohibited intimate images that could be widely inclusive of many forms of NSII, as long as the images are reasonably convincing.

This article examines statutory laws that explicitly address intimate images. These are not the only laws available to people targeted by NSII.[60] For those wishing to pursue a legal remedy for NSII, there are a variety of existing civil, criminal, intellectual property, and human rights laws in Canada that could apply to NSII, allowing for complaints regarding extortion, defamation, privacy, or copyright.[61] Additionally, leading scholars on IBSA, such as Laidlaw,[62] Citron,[63] and Eaton and McGlynn,[64] have called for more general privacy laws that would protect the privacy of targets of IBSA and NSII. However, this article does not examine those laws and limits its analysis to legislation that directly addresses the distribution of intimate images.

1. Criminal Law: Child Pornography and the Publication of an Intimate Image Without Consent

Canada’s Criminal Code addresses prohibited intimate images under its provisions on child pornography and the publication of intimate images without consent.[65] The child pornography provision prohibits making, distributing, possessing, and accessing “a photographic, film, video or other visual representation, whether or not it was made by electronic or mechanical means” that shows or depicts a person under the age of eighteen engaged in sexual activity, as well as images depicting the sexual organ or anal region of a person under the age of eighteen for a sexual purpose.[66] This definition includes both real and artificial intimate images of children and is inclusive of some forms of NSII.

The non-consensual intimate images provision of the Criminal Code prohibits the distribution of intimate images. It defines an intimate image as "a visual recording of a person made by any means including a photographic, film or video recording" in which a person is engaged in sexual activity or is nude, exposing his or her genital organs or anal region or her breasts, where the person would have a reasonable expectation of privacy at the time the recording was made and at the time of distribution.[67] Although no case involving NSII has yet tested this definition, on a plain reading it appears to include only authentic intimate images of a person and would therefore not capture NSII.

To date, there are at least two reported criminal cases in Canada in which a person was prosecuted under the child pornography provision for making non-consensual sexual deepfakes; there are also other child pornography cases involving simpler technological means, such as Photoshop.[68] In the 2023 case of R v. Larouche, the accused created sexual deepfakes of children and included them in his collection of real child pornography. He was subsequently convicted of making, possessing, and distributing child pornography.[69] In the 2024 case of R v. Legault, a youth pastor pleaded guilty to making and possessing child pornography.[70] He possessed a collection of child pornography, including at least one image of a teen girl he had "nudified" using the DeepNude app. He had also digitally altered an image of a second girl's face to appear on the body of a nude child. Legault was further found in possession of 150 photos of children that the police suspected he planned to input into the DeepNude application. These are some of the first NSII child pornography cases, but they will not be the last. With the increase in deepfakes and other forms of synthetic media being used for child sexual abuse material,[71] the number of successful criminal cases involving child-based NSII will likely increase. However, the criminal law likely does not protect adults targeted by NSII. To date, the author is aware of no reported criminal cases involving NSII under the laws on the non-consensual distribution of intimate images.

2. Civil Law: Intimate Image Statutes

In most Canadian provinces, civil intimate image statutes have been introduced over the last decade to provide a civil remedy when intimate images are shared without consent. Of the common law jurisdictions in Canada, only Ontario and the territories have not introduced specific statutory civil intimate image laws.[72] In several provinces where intimate image statutes are in force, the definition of intimate images is similar to that in the Criminal Code, including Alberta,[73] Manitoba,[74] Newfoundland and Labrador,[75] and Nova Scotia.[76] Like the Criminal Code, these provincial definitions seem to limit their application to actual intimate images of a person, although there may be some possibility for the courts to interpret them as inclusive of NSII. Following recommendations made by Laidlaw and Young[77] and the Uniform Law Conference of Canada,[78] several provinces introduced intimate image statutes that defined "intimate images" more broadly to be clearly inclusive of deepfakes. British Columbia,[79] New Brunswick,[80] Prince Edward Island,[81] and Saskatchewan[82] defined intimate images to include images that have been altered to depict a person engaged in sexual activity, nude (and in some cases, such as British Columbia and Prince Edward Island, nearly nude), or exposing their genital organs, anal region, or breasts. The definitions in these provinces further require that the person had a reasonable expectation of privacy at the time of the relevant recording or distribution or, in British Columbia and New Brunswick, at the time of a simultaneous recording or live stream. These definitions clearly include forms of NSII where an image was altered to depict an intimate image, such as deepfakes and nudified images; however, it remains to be seen whether they are broad enough to capture fully generated images, like those made of Swift, where there is no clear original image that was altered. Because generative AI requires original images of a person in order to generate novel depictions in their likeness, the generated images could arguably be considered altered images under some civil intimate image statutes. Considering the meaning and purpose behind this new definition of intimate image, it seems reasonable that the courts could and should interpret these types of images as altered intimate images, although there is a risk that some may not. To date, no cases involving NSII under any of these acts have been reported.

In March 2024, Manitoba introduced Bill 24, The Intimate Image Protection Amendment Act (Distribution of Fake Intimate Images). It sought to revise the definition of intimate images to include fake intimate images, which would encompass any type of visual recording “that, in a reasonably convincing manner, falsely depicts an identifiable person” engaging in explicit sexual activity or as being nude, or exposing their genital organs, anal region, or breasts.[83] The definition further specifies that this includes images created through “the use of software, machine learning, artificial intelligence or other means, including by modifying, manipulating or altering an authentic visual representation” and that “it is reasonable to suspect that the person depicted in the image would not consent to the recording being made or distributed to others.”[84] These false depictions are distinguished from a “personal intimate image” of a person, which is an authentic intimate image where a person would have a reasonable expectation of privacy at the time the image was recorded and distributed.

This bill has since come into force, making Manitoba the first province to update its intimate image statute to include this broader definition of intimate images.[85] Interestingly, there is no reasonable expectation of privacy requirement in the definition of a fake image; instead, the definition focuses on whether the person would have consented to the distribution.

3. Federal Online Harms Bill

In 2024, the federal government introduced Bill C-63, which would enact the Online Harms Act and aims in part to regulate social media companies. It creates a special duty requiring social media platforms to make inaccessible to all persons in Canada content that sexually victimizes a child, revictimizes a survivor, or is intimate content communicated without consent.[86] The bill's definition of content that sexually victimizes a child or revictimizes a survivor includes a visual representation in which a child, or someone who is depicted as being a child, is engaged in explicit sexual activity, or an image depicting a child's sexual organs or anal region for a sexual purpose, as well as some other sexualized images involving children.[87] Like the Criminal Code's child pornography provisions, this definition is broad enough to capture real or synthetic images. It does not include any additional requirements pertaining to a reasonable expectation of privacy or consent to the communication of the images. The bill's definition of intimate content includes actual images and images that "falsely presents in a reasonably convincing manner" a person engaged in sexual activity, nude, or exposing their sexual organs or anal region.[88] Deepfakes are explicitly named as a type of image that would fit within this definition, which is likely broad enough to capture other forms of NSII, so long as they are "reasonably convincing." For false images, there must be reasonable suspicion that the subject did not consent to the image being communicated. Both the federal and Manitoba definitions move away from the reasonable expectation of privacy standard for NSII to a standard based on consent to distribution. Neither addresses the non-consensual creation of these images.[89]

B. Defining Intimate Images

To address NSII, governments could consider expanding their definitions of intimate images to include synthetic, or false but reasonably convincing, intimate images, as Manitoba did and as the federal government has proposed. Over the last decade, intimate image laws have shifted in some areas from a definition seemingly limited to authentic intimate images to one that includes altered images, and there are now proposals to include false images that are reasonably convincing. These evolving definitions have aligned with the technological and social circumstances at the time of their legislative introduction. Early laws reacted to actual intimate images released without consent; these were followed by newer legislation that restricted the distribution of altered images in response to the rising popularity of deepfakes and nudifying apps. Now, with generative AI, the definition of intimate images in Manitoba and the proposed definition in the Online Harms Act have expanded even further to include intimate images that are reasonably convincing but falsely depict someone nude or engaged in a sexual act. The following section discusses some of the challenges that will arise as the definition of intimate images develops to include synthetic or fake but reasonably convincing images under both criminal and civil laws on the non-consensual sharing of intimate images.

The definition of "altered" intimate images in many civil intimate image laws was introduced to address deepfakes. One would hope that this definition will be interpreted to include novel images, such as those created using generative AI, but the word "altered" suggests that the statute may require an original image that was altered, which creates some risk. The definition would likely include face-swapping deepfakes, nudified images, and photoshopped images where the original images are identifiable in the final image. However, it may not capture cases such as those involving generative AI[90] or hyper-realistic avatars created for virtual reality sexual simulations,[91] where an original image may not be easily identifiable or may not exist at all. Further, given that it is increasingly possible to create a realistic digital image of a person without an original photo of them as a base, lawmakers must be alert to these technological advancements when crafting laws addressing the evolving harm presented by AI and hyper-realistic depictions. The potentially underinclusive definition of "altered" intimate images could also pose evidentiary hurdles if the original altered image must be identified or produced, within the context of an already complex evidentiary area of law.[92]

If these laws are meant to protect individuals from realistic fake intimate images, either the definition of "altered" must be interpreted broadly enough to capture realistic images of a person whether or not an identifiable original image was altered, or a more encompassing term should be used to capture NSII, such as synthetic images or fake images that are reasonably convincing.

As intimate image laws move towards a definition of reasonably convincing false images, the line between what type of content falls within this definition and what does not is sure to be contested. It is important to note that not all non-consensual sexual representations should be captured by intimate image laws, and that certain defences should be available.[93] For example, a crudely drawn digital image of a person depicted engaging in sexual activity should not fit within these definitions of regulated intimate images. Such a portrayal may be insulting or offensive, but it should not cross the threshold into regulated images. Alternatively, many NSII are realistic but of such poor quality that an unaided observer can identify them as fake. Notably, many hyper-realistic NSII are explicitly labeled to inform the viewer that they are fake, such as those featured on deepfake porn websites that clearly state that the videos are deepfakes, or are understood as fake due to the celebrity status of the person featured in them.[94] Despite this, these images can still cause cognizable harms, including sexual integrity harms, particularly when distributed.[95] The line between images that capture the plaintiff's likeness and purport to depict reality and those that do not will need to be thoughtfully addressed by lawmakers and the courts. This crucial analysis should focus on upholding the sexual integrity of the image's subject.

Deciding which altered images and which reasonably convincing false images should fall within legal regulation will be challenging.[96] For example, how should lawmakers categorize very realistic altered images that do not purport to depict reality, such as a high-quality deepfake of a person depicted in a sexually graphic scene set in outer space? The context of the image may not depict reality, which affects the consideration of whether it is reasonably convincing. However, the images of the subject's face and body may look perfectly real and harm their sexual integrity if shared. As such, placing a realistic intimate image in an unrealistic context should not be enough to exclude the depiction from the definition of intimate image. Alternatively, where an image is in a realistic setting but is of lower quality, the threshold of what is reasonably convincing will also be tested. Such could be the case for a non-consensual sexual deepfake that is somewhat glitchy or an AI-generated image with the overly glossy quality typical of such images. These flaws may signal to the viewer that the image is digitally created and not real, but small technical imperfections should not be enough to exclude an image from this definition so long as the person is identifiable and the image has a clear realism to it. Fundamentally, the threshold for legal regulation should not be so high as to exclude the majority of realistic NSII that are nevertheless identifiable as digitally created on close examination, nor so low as to include any artistic rendering of a nude person, such as a crude line drawing or unrealistic cartoon.

The expansion of civil laws on the non-consensual distribution of intimate images and federal social media content moderation laws to include NSII is less controversial than introducing criminal provisions. Many agree that laws supporting swift takedown orders and the deletion of non-consensually distributed intimate images should be introduced.[97] Conversely, the criminalization of these images is highly debated.[98] As with all criminal laws, there is a need for greater scrutiny when considering criminalizing NSII due to the significant risks to liberty and the potential human rights challenges related to freedom of expression that could arise. Additionally, many argue that a carceral approach has been ineffective in addressing sexual harms in Canada[99] and should not be used for what some argue are expressive behaviours.[100] Others support a criminal approach as a remedy for targets of this type of abuse.[101] Although the criminal justice system presents significant systemic barriers to assisting victims of IBSA[102] and should not be the primary source of legal intervention,[103] scholars such as Mathen argue that criminal law plays an important expressive function in recognizing sexual wrongs that involve blameworthy sexual objectification and lead to individual and systemic harms, such as forms of IBSA.[104] Since their introduction in Canada, criminal law provisions have proven to be an effective tool for some people targeted by specific forms of IBSA, such as victims of voyeurism and the non-consensual distribution of intimate images.[105] If the criminal law definition of intimate images is expanded to include NSII, deeper scrutiny will be required of which behaviours would be captured by the criminal provision and what punishments would be appropriate.

Conclusion

This article reviews one possible option for addressing NSII: broadening the language of criminal and civil intimate image laws to bring NSII within legal regulation. It does not suggest that this is the only or even the best solution. However, with the proliferation of sexualized generative AI, it is one worth considering. It will take time to assess the effectiveness of the legal and non-legal approaches used to address these harms as the technology develops. While solutions addressing NSII are still developing, it is undeniable that NSII cause real harms to many of the individuals featured in them and that legal action is needed. Legal intimate image protections should cover both real and synthetic images, ensuring that people have effective rights and supports to protect their sexual integrity, digital or otherwise.