by Brittany Bruner*

Nonconsensual deepfake pornography (“NDFP”) harms victims and strips them of the ability to control how they are intimately portrayed. To date, no federal statute explicitly addresses NDFP, but Congress recently passed 15 U.S.C. § 6851, a civil statute that addresses nonconsensual pornography (“NCP”). This Contribution argues that 15 U.S.C. § 6851 applies to NDFP and that NCP and NDFP should be recognized as exceptions to First Amendment protection and, therefore, unprotected speech; the statute provides an avenue by which to create these exceptions. Finally, even if NCP and NDFP are not deemed First Amendment exceptions, applying the statute to NDFP survives both intermediate and strict scrutiny.


Nonconsensual deepfake pornography (“NDFP”) involves a mashup of a real person with another video or image that may or may not also be of a real person. For example, NDFP videos often graft a real person’s face onto the body of a performer in an existing pornographic video. NDFP poses a significant risk of harm: It can be used as a tool for intimate partners to control, coerce, or blackmail their partners,1 or as a tool for profit.2 And it can have lasting and harmful consequences for victims.3 Mere possession of NDFP is enough for abusers to exert control over victims,4 and those who create it should be held accountable. Given its novelty, no federal statute explicitly addresses NDFP. However, Congress recently created a private right of action for the nonconsensual sharing of intimate visual depictions under Section 1309 of the Violence Against Women Act Reauthorization Act of 2022.5 The statute regulates nonconsensual pornography (“NCP”)6 but does not limit itself to NCP; it should therefore also apply to NDFP.

This Contribution will analyze how victims of NCP—and, by extension, NDFP—can pursue a civil action under 15 U.S.C. § 6851. It then addresses the First Amendment challenges the statute may face and argues that NCP and NDFP should be recognized as exceptions to First Amendment protection, similar to the child sexual abuse material (“CSAM”)7 exception and other categories of unprotected speech. Recognizing this exception under 15 U.S.C. § 6851 does not run afoul of overbreadth doctrine. Finally, this Contribution analyzes how, even if no exception is created for NCP and NDFP, 15 U.S.C. § 6851 would survive both intermediate and strict scrutiny.

First, 15 U.S.C. § 6851 should apply to NDFP. The statute directly addresses NCP, so courts can analogize NDFP to NCP. By way of background, while NCP and NDFP differ in how they are created and obtained, they share crucial similarities and, as described below, are both extremely harmful to victims. NCP generally depicts the victim’s actual body, whereas NDFP merges a real person’s likeness with another image or video (often of a nude or partially nude body) that may or may not be of a real person.

NCP “refers to sexually explicit images and video disclosed without consent and for no legitimate purpose.”8 It comes in many forms, including “footage obtained by hidden cameras, consensually exchanged images within a confidential relationship, stolen photos, and recordings of sexual assaults,”9 and is often used as a means of control or embarrassment.10

NDFP is newer.11 The term “deepfake” originated with a user who, posting under the name “deepfakes,” created AI-generated pornographic videos by placing celebrities’ faces on the nude bodies of pornographic actors.12 “Deepfake” is now popularly used to refer to any fake video, and deepfakes are expected to proliferate rapidly.13 Most deepfake videos are pornographic, created without consent, and feature women;14 many feature people the creators know;15 and deepfake sex videos have become more realistic and are “increasingly difficult to debunk.”16

Although NDFP and NCP differ in medium (with NDFP being AI-generated), the harmful effects are often the same.17 First, NCP and NDFP “can cause immediate and irreversible harm” because the images can be made widely available immediately and can go viral, “at which point it can dominate the search engine results for the victim’s name.”18 Victims have a significant chance of experiencing adverse consequences such as emotional harm, mental health challenges, difficulty with employment, suicidal ideation, withdrawal from social situations, extreme anxiety, and eating disorders.19 Second, NCP and NDFP are both dangerous tools for intimate partner violence. Domestic violence abusers use NCP to control victims and to try to prevent them from leaving a relationship.20 Third, NCP and NDFP breach victims’ privacy, leaving victims with little control over their sexual identities online and allowing abusers to share their intimate lives (or AI-generated projections of their intimate lives) online.21 Victims may feel unable to prove that an NDFP is, in fact, fake, and may feel compelled to follow the wishes of their abusers who use the NDFP to control them.22

While there are currently no federal statutes that explicitly address NDFP,23 there is now a civil statute addressing the nonconsensual disclosure of intimate images, which victims of NDFP should use to seek civil recourse. 15 U.S.C. § 6851 creates a private civil right of action under which a plaintiff may recover actual damages or liquidated damages of $150,000, plus the costs of the action, including reasonable attorney’s fees and other litigation costs.24 Additionally, the court has discretion to grant other relief, “including a temporary restraining order, a preliminary injunction, or a permanent injunction ordering the defendant to cease display or disclosure of the visual depiction.”25 The statute provides a federal private civil action against a person who knowingly or with reckless disregard discloses an individual’s “intimate visual depiction.”26 A “depicted individual” is defined as

an individual whose body appears in whole or in part in an intimate visual depiction and who is identifiable by virtue of the person’s face, likeness, or other distinguishing characteristic, such as a unique birthmark or other recognizable feature, or from information displayed in connection with the visual depiction.27

This language paints with a broad brush, covering any body part or any information that links the depiction to an individual. The last clause suggests that if someone were to attach a name, contact information, or other identifying information to an image, the image would fall under the statute even if the individual were not identifiable by facial or other bodily features. Furthermore, an “intimate visual depiction” covers both nude depictions and depictions of sexual acts.28 While the statute includes some exceptions, none address deepfakes. Thus, because a deepfake involves a mashup of a depicted individual onto another individual, the victim’s body would “appear[] in whole or in part in an intimate visual depiction,”29 and if the depiction is identifiable as the victim, the plain text of 15 U.S.C. § 6851 provides a cause of action for NDFP.

Plain text and proximity to covered conduct (NCP) indicate that 15 U.S.C. § 6851 should apply to NDFP, but the statute is new and has not yet been tested by constitutional challenges. In particular, defendants might assert that the First Amendment protects their conduct, and victim-plaintiffs should respond that NDFP is unprotected speech.

The Supreme Court has not designated NCP or NDFP as an exception to First Amendment protections. Just as it recognized the harm to victims as a valid reason to except CSAM from First Amendment protection,30 the Court should designate NCP and NDFP as unprotected speech. The Court created the CSAM exception because the conduct at issue did not fit within the obscenity exception: obscenity doctrine addresses the offensiveness of the underlying content, whereas the CSAM exception targets the act of production and its harmful effects.31 The Court found discussions of the value of the underlying content irrelevant to creating the CSAM exception.32 Here, too, 15 U.S.C. § 6851 is aimed at the nonconsensual sharing of intimate visual depictions rather than at their underlying content. While the CSAM exception does not map perfectly onto NCP and NDFP, it provides a framework for carving out a parallel exception, because the problem with NCP and NDFP is the nonconsensual disclosure of the intimate visual depictions, not the sexually explicit or intimate content of the images themselves.

The Supreme Court created the First Amendment exception for CSAM in New York v. Ferber, finding that regulations on CSAM can be exempt from the First Amendment regardless of whether the material is considered obscene.33 The Court provided five reasons for creating the exception. First, it found the statute’s protection of children’s “physical and psychological well-being”34 to be a compelling interest because, as noted by the legislature and relevant literature, “the use of children as subjects of pornographic materials is harmful to the physiological, emotional, and mental health of the child.”35 Therefore, under the first consideration, which analyzes the state’s interest in passing the legislation, the statute at issue in Ferber “easily passes muster under the First Amendment.”36 Second, the Court found that distribution of CSAM is “intrinsically related to the sexual abuse of children” because “the materials produced are a permanent record of the children’s participation and the harm to the child is exacerbated by their circulation.”37 The Court recognized that, in order to address the harm of sexually exploiting children through producing CSAM, “the distribution network for child pornography must be closed.”38 Third, people sell and advertise CSAM for money, which adds to “the production of such materials, an activity illegal throughout the Nation.”39 Fourth, the value of permitting live performances and photographic reproductions of children “engaged in lewd sexual conduct is exceedingly modest, if not de minimis.”40 Fifth, a CSAM exception to First Amendment protection “[wa]s not incompatible with [the Court’s] earlier decisions” that “balance . . . competing interests,” in part because the harm of CSAM so heavily affects the welfare of the children involved that “the balance of competing interests is clearly struck and . . . it is permissible to consider these materials as without the protection of the First Amendment.”41 The Court also noted that “the evil to be restricted so overwhelmingly outweighs the expressive interests, if any, at stake, that no process of case-by-case adjudication is required.”42 Ferber does not require the portrayed sexual conduct to be offensive, and the material “need not be considered as a whole.”43 Thus, the Court grounded the CSAM exception in the harms inflicted on children by the production, exhibition, distribution, and sale of CSAM rather than in whether the underlying content is obscene.

Considering the Ferber analysis, the Court should create a First Amendment exception for NCP and NDFP. The harms of NCP can be analogized to the five considerations addressed in Ferber. First, NCP also negatively affects the physical and psychological well-being of its victims.44 While the subjects are not minors, the negative physical and psychological effects are significant and should bear on any First Amendment analysis of NCP regulation.45 Second, while NCP does not necessarily amount to sexual abuse of the subject, it does create a permanent sexual record of a nonconsenting victim, a harm directly created by the NCP, and regulating it can help prevent its spread.46 Third, NCP is not a federal crime, but some states do criminalize it. Regardless, the harms should not be minimized based on whether the behavior is criminalized,47 so this should not preclude NCP from being excepted from First Amendment protections. Fourth, the value of permitting live performances or photographic reproductions resembling NCP is also “exceedingly modest, if not de minimis.”48 Fifth, if a statute regulating NCP is properly narrow, the balance of competing interests poses no obstacle, because the negative effects on victims far outweigh any benefits to those who knowingly share or create the NCP, and any outliers can be resolved on a case-by-case basis. Although the harm from NCP does not always stem from its production (some NCP involves victim-created content that is shared nonconsensually), the act of sharing is itself harmful. The distinction later drawn in the virtual-CSAM cases, discussed below, turned on whether any real child was affected by the production of virtual images; the harm analysis should therefore focus on how the material affects the victim rather than being tied solely to abusive production.

While the Court found the statute at issue in Ferber sufficiently narrow to fit within the CSAM exception, it recognized that the exception must have some limits.49 A statute cannot be overbroad, or “incapable of limitation”; narrower statutes lessen the chill on speech.50 But a properly narrow statute will not overreach to punish legitimate speech, and concerns about overbreadth in these contexts can be resolved on a case-by-case basis.51

In Ashcroft v. Free Speech Coalition,52 the Court struck down parts of the Child Pornography Prevention Act (CPPA) as overbroad, finding that the overbroad provisions did not fall within the CSAM exception.53 Ashcroft involved a challenge to the CPPA on the basis that it prohibited sexually explicit images “that appear to depict minors but were produced without using any real children.”54 This included possession or distribution of images “which may be created by using adults who look like minors or by using computer imaging.”55 The Court found this prohibition on “child pornography that does not depict an actual child” to be beyond the reach of Ferber.56

The Ashcroft Court noted that in Ferber, “[t]he production of the work, not its content, was the target of the statute,” which was aimed at protecting children’s interests.57 Ferber found that the production of CSAM harmed children by creating “a permanent record of a child’s abuse” that, like a defamatory statement, injures the child anew through continued circulation.58 Nevertheless, Ashcroft recognized that virtual CSAM images could be a permissible alternative to real images because some may have literary or artistic value.59 The Court rejected the Government’s argument that virtual images are just as harmful as real ones because they are indistinguishable and therefore “promote the trafficking in works produced through the exploitation of real children.”60 The Court found this implausible, reasoning that if virtual images were truly indistinguishable from real ones, the market would shift to the virtual substitutes.61 The Court made clear that Ferber’s holding turns on the impact on actual children depicted in the pornography, and that whether the images depict a real or a virtual child is a crucial distinction in evaluating their harms.62

Finally, the Court mentioned the concept of “computer morphing,” which involves altering images of real children to create sexually explicit images, and stated, without deciding, that such material was closer to the images at issue in Ferber because real children’s interests were implicated.63

Under Ashcroft, 15 U.S.C. § 6851 is narrow enough in scope to address the harms of NCP without sweeping in protected speech. The statute creates a cause of action against a person who “knows” or “recklessly disregards” that an “individual has not consented” to disclosure of their intimate images.64 Thus, the statute addresses the conduct of disclosing a person’s intimate visual depictions without consent rather than restricting the sexually explicit content itself. This puts it outside the obscenity exception but fits closely with the CSAM exception because it addresses harmful behavior related to pornography. The statute also contains exceptions that properly narrow it; the statute is not overbroad because it excepts disclosures made for legitimate purposes.65 These exceptions help ensure that the statute will not inadvertently restrict otherwise permissible speech. The statute also imposes a knowledge or recklessness standard,66 which precludes causes of action against people who neither know nor recklessly disregard that the depicted individual did not consent to disclosure. This, along with the listed exceptions, narrowly tailors the statute, and it should survive any overbreadth challenge.

The Supreme Court has recently shown reluctance to create new categorical exceptions to the First Amendment but has not outright rejected the possibility. For example, in United States v. Stevens, the Court rejected the contention that depictions of animal cruelty, as distinct from the acts themselves, were a type of speech historically prohibited and therefore outside First Amendment protection.67 The Court also refused to adopt a test for creating new First Amendment exceptions that would balance the value of speech against its societal costs.68 Distinguishing Ferber, the Court noted that Ferber’s holding did not rest on a “balance of competing interests alone,” but relied on the intrinsic relationship between the market for CSAM and the underlying child abuse involved in its illegal production.69 The Court also noted a connection between the CSAM exception and the valid criminalization of CSAM.70 While acknowledging that Stevens did “not foreclose the future recognition of such additional categories,” the Court suggested that new exceptions would need to cover “categories of speech that have been historically unprotected,” which may include those that “have not yet been specifically identified or discussed . . . [in] case law.”71

As indicated above, application of the five Ferber considerations suggests that NCP and NDFP should be treated similarly to CSAM. But if a First Amendment exception for NCP and NDFP had to be rooted in history and tradition, as Stevens suggests, one could argue that the harms of NCP and NDFP “closely parallel those [harms] of child pornography, defamation, and public disclosure of private fact—all of which are historically unprotected speech.”72 NCP, and by extension NDFP, also has roots in criminality, such as “extortion, stalking, harassment, and rape, as well as . . . unlawful sex discrimination.”73 Thus, viewing history broadly and in light of modern forms of information sharing, statutes regulating NCP do have historical analogs. That said, if the Court creates a First Amendment exception for NCP using the reasoning of Ferber rather than Stevens, it need not find a historical analog at all.74

If NCP is unprotected speech, the next question is whether NDFP is as well. Recent precedent from the courts of appeals suggests that if NCP is excepted under Ferber’s reasoning, then NDFP should be too. In Ashcroft, the Court briefly mentioned “computer morphing,” which involves using images of real children to create virtual CSAM.75 The Court stated that such material was closer to the CSAM exception recognized in Ferber because real children’s interests were implicated, but it did not decide the issue.76 Since then, three of the four circuit courts to address the issue have found that morphed CSAM is unprotected speech.77

The reasoning of the Second, Fifth, and Sixth Circuits is more persuasive than that of the Eighth Circuit, and it can be analogized to NCP and NDFP: the harms of NCP and NDFP are indistinguishable,78 and a requirement of underlying criminality or a historical analog is irrelevant to the harms caused.

Importantly, some Supreme Court Justices have hinted at a willingness to uphold properly tailored regulations of new technologies, which should extend to NDFP. For example, Justice Thomas suggested in his Ashcroft concurrence that technological advances making it difficult to discern whether virtually created images depict real children may warrant appropriately narrow legislation.79 In her Ashcroft dissent, joined by Chief Justice Rehnquist and Justice Scalia, Justice O’Connor noted her concern about defendants claiming their CSAM is computer-generated in order to avoid liability.80 And Justice Gorsuch, dissenting from the denial of certiorari in a defamation case, suggested that technological change may require updates to defamation law.81 While none of these opinions directly addresses NDFP, they suggest an openness to statutes regulating developments in new technology.82 AI is certainly such a technology.

Based on the foregoing analysis, NCP and NDFP should be recognized as exceptions to First Amendment protection through the properly tailored 15 U.S.C. § 6851. Alternatively, if the Court does not recognize such an exception, 15 U.S.C. § 6851 should survive both intermediate and strict scrutiny.

Permissible restrictions on speech are not limited to the judicially created categorical exceptions.83 15 U.S.C. § 6851 should receive intermediate scrutiny because it operates as a time, place, and manner restriction rather than a content restriction. The statute does not address the underlying content of the images themselves; rather, it addresses the nonconsensual manner in which they were shared.84 Arguably, 15 U.S.C. § 6851 is designed to protect privacy rather than to pass judgment on viewpoints, making it a content-neutral time, place, and manner restriction subject to lower scrutiny than content-based regulations.85 Even if 15 U.S.C. § 6851 were a content restriction, the speech it regulates is not “core political speech that receives the highest level of First Amendment protection.”86 Because NDFP is not speech on a matter of public concern, it should not be subject to strict scrutiny.87

Intermediate scrutiny requires a law to be “‘narrowly tailored to serve a significant government interest.’ In other words, the law must not ‘burden substantially more speech than is necessary to further the government’s legitimate interests.’”88 NCP and NDFP implicate significant privacy concerns because perpetrators share victims’ images without consent, and both cause great social, emotional, and psychological harm.89 15 U.S.C. § 6851 is a reasonable fit between these ends and its means: it protects against these harms by targeting only those who knowingly or recklessly share the images without consent. The statute should therefore survive intermediate scrutiny.90

Some scholars assert that NCP laws should trigger strict scrutiny because they are both content-based and viewpoint-based.91 However, 15 U.S.C. § 6851 focuses on the harm of disclosing the images rather than on any particular viewpoint, so it should not be subject to strict scrutiny.92 Even so, the statute would survive strict scrutiny because it is narrowly tailored to address compelling government interests, namely preventing the real-life harms that flow from sharing NDFP and NCP.93 Upholding the statute under strict scrutiny would comport with other widely recognized restrictions, such as legal protections for privacy.94 Under strict scrutiny, statutes regulating content “must be narrowly tailored to promote a compelling Government interest” and must use the least restrictive means available.95 The Eighth Circuit has found that a prohibition on morphed CSAM survives strict scrutiny because minors feel “the damage from a morphed image” that “‘necessarily follow[s] from the speech’ itself.”96 The court reasoned that “[t]he government thus has a compelling interest in protecting innocent minors from the significant harms associated with morphed images” and that there were no means less restrictive than regulating its use.97 Thus, the statute survived strict scrutiny.98 Analogizing to NDFP and NCP, 15 U.S.C. § 6851 provides recourse for the significant harms that flow from NCP and NDFP,99 and there is no less restrictive way to stop perpetrators from sharing NCP and NDFP than to regulate the conduct, because perpetrators otherwise have no incentive to stop.

Additionally, 15 U.S.C. § 6851 should survive strict scrutiny because it provides recourse for invasions of sexual privacy, and laws protecting private information are commonly viewed as constitutional.100 Sexual privacy concerns the freedom with which “we can manage the boundaries around our bodies and intimate activities.”101 Courts have recognized that visual and aural details of sexual relations are ordinarily considered private.102 Violations of privacy can chill victims’ speech out of fear of further dissemination of NCP or NDFP.103 Advocates against NCP note that women must guard themselves in public to avoid “upskirt” and “downblouse” photography, which chills their expression.104 This problem is exacerbated with NDFP, which can be created from any image; the only way to prevent it completely would be to have no digital images of oneself at all. Additionally, NDFP seizes control of another person’s sexual identity and fabricates a new one, even if the depictions do not feature the intimate portions of the person’s actual body, which is a significant invasion of privacy.105 Finally, NCP and NDFP disproportionately affect women, minors, people of color, and sexual minorities; regulations against them should therefore be upheld under both First Amendment and Fourteenth Amendment principles, as regulation is a necessary means of achieving the government’s end of preventing the privacy invasions of NDFP and NCP, especially those affecting protected classes.106 These privacy concerns create a compelling government interest in regulating NCP and NDFP, and 15 U.S.C. § 6851 should survive strict scrutiny.

While the Court should create a First Amendment exception for NCP and NDFP because of their close analogy to the CSAM exception and their de minimis expressive value, properly tailored statutes like 15 U.S.C. § 6851 should, even absent such an exception, withstand both intermediate and strict scrutiny.

NCP and NDFP are pernicious invasions of victims’ privacy and powerfully oppressive tools for perpetrators and domestic violence abusers. They allow another person to control a victim’s sexual identity and with whom that identity is shared. This speech has no more than de minimis value. While the federal government has unfortunately been slow to regulate NCP and NDFP, Congress has created a federal civil cause of action, 15 U.S.C. § 6851, that provides some recourse for victims. Actions brought under § 6851 should withstand constitutional challenges because the statute is narrowly tailored and because NCP and NDFP fit within the Court’s prior reasoning for excepting speech from First Amendment protection. Namely, NDFP and NCP provide de minimis value, and “the evil to be restricted so overwhelmingly outweighs the expressive interests, if any, at stake, that no process of case-by-case adjudication is required.”107 Regulations may still face other challenges, such as overbreadth, if they are not properly cabined, as 15 U.S.C. § 6851 is, to make room for legitimate uses of the conduct the statute covers.108 Finally, even if the Court were not to find an exception, statutes regulating NDFP and NCP like 15 U.S.C. § 6851, which both prevents significant harm and is narrowly tailored to that end, should survive both intermediate and strict scrutiny. NDFP and NCP have no place on the Internet, and people who share them should not find cover under the First Amendment.


* Brittany Bruner is a J.D. Candidate (2024) at New York University School of Law. This Contribution was developed from Professor Emily Sack’s Domestic Violence Law Seminar taught during Fall 2023.

1. This piece does not focus on websites and hosting platforms, which can be very difficult to sue under Section 230 of the Communications Decency Act and the First Amendment. The best way to address website behavior may be through company action. See Kevin Roose, Here Come the Fake Videos, Too, N.Y. Times (Mar. 4, 2018), https://www.nytimes.com/2018/03/04/technology/fake-videos-deepfakes.html (last visited Apr. 15, 2024) (noting that Twitter and Pornhub banned nonconsensual deepfake (“NDFP”) videos of celebrities and Reddit closed some deepfake pages); Benjamin Goggin, From Porn to ‘Game of Thrones’: How Deepfakes and Realistic-looking Fake Videos Hit it Big, Business Insider (Jun. 23, 2019, 10:45 AM), https://www.businessinsider.com/deepfakes-explained-the-rise-of-fake-realistic-videos-online-2019-6 (noting that Twitter, Discord, Gfycat, and Pornhub have banned deepfakes and associated communities, and Reddit has banned pornographic deepfakes. Additionally, “Gfycat . . . announced that it was using AI detection methods in an attempt to proactively police deepfakes.”). Much NDFP is created by anonymous users, which also makes it difficult to sue or punish the creators. This Contribution will focus on situations where the victim knows the creator of the NDFP and on how to pursue a civil action against that actor.

2. See Cyrus Farivar, Etsy Has Been Hosting Deepfake Porn of Celebrities, Forbes (Dec. 20, 2023, 10:19 AM), https://www.forbes.com/sites/cyrusfarivar/2023/12/20/etsy-has-been-hosting-deepfake-porn-of-celebrities/?sh=7450f4bf5927.

3. See, e.g., Nina I. Brown, Deepfakes and the Weaponization of Disinformation, 23 Va. J.L. & Tech. 1, 13–14 (Spring 2020) (noting the harm that can occur, such as blackmail or identity theft, as soon as the video is distributed, especially if targeted to a small group).

4. See id. (noting the psychological effect of knowing the NDFP exists, even if it is not shared).

5. 15 U.S.C. § 6851(b)(1)(A) (“[A]n individual whose intimate visual depiction is disclosed . . . without the consent of the individual, where such disclosure was made by a person who knows that, or recklessly disregards whether, the individual has not consented to such disclosure, may bring a civil action against that person in an appropriate district court of the United States.”); see also Congressional Research Service, Federal Civil Action for Disclosure of Intimate Images: Free Speech Considerations (2022), https://crsreports.congress.gov/product/pdf/LSB/LSB10723#. There are no federal criminal actions regulating nonconsensual pornography (“NCP”) or NDFP. While there are some state statutes that regulate NCP, they are outside the scope of this Contribution.

6. NCP is often “misleadingly referred to as ‘revenge porn,’” and “refers to sexually explicit images disclosed without consent and for no legitimate purpose.” Mary Anne Franks, Democratic Surveillance, 30 Harv. J.L. & Tech. 425, 481 (Spring 2017). This Contribution will use the term NCP throughout rather than “revenge porn,” unless quoting a source.

7. While the caselaw and statutes generally refer to “child pornography,” “legislators, law enforcement, and advocates increasingly use terms that emphasize the abusive nature of the content, such as ‘child sexual abuse material’ or ‘child sexual exploitation and abuse imagery’ (CSEAI).” Meg Hennessey et al., AI-Generated Child Sexual Abuse Material: How Companies Can Reduce Risk, Orrick (Feb. 9, 2024), https://www.orrick.com/en/Insights/2024/02/AI-Generated-Child-Sexual-Abuse-Material-How-Companies-Can-Reduce-Risk#ref1. This Contribution will use CSAM throughout, unless “child pornography” is directly quoted.

8. Mary Anne Franks, “Revenge Porn” Reform: A View from the Front Lines, 69 Fla. L. Rev. 1251, 1258 (2017).

9. Id.

10. See id. (listing examples of people sharing or using NCP to control others, including sex traffickers controlling sex workers, a person committing rape using NCP to stop the victim from reporting, and nursing home workers posting nude photographs of their patients to social media).

11. NDFP videos of celebrities first appeared on Reddit in 2017, and FakeApp, which makes it much easier to create a deepfake video, was posted by an anonymous user on Reddit shortly after. Rebecca A. Delfino, Pornographic Deepfakes: The Case for Federal Criminalization of Revenge Porn’s Next Tragic Act, 88 Fordham L. Rev. 887, 893–94 (2019) (noting that FakeApp is an AI-assisted technology that “analyzes and manipulates images of a person’s face and then maps it onto a different person’s body in a video” through “an easy five-step process”). It can extract images from the Internet (e.g., by pulling from social media photos and videos), and it is fairly easy for a user to create a deepfake video, but the quality can vary. See, e.g., Roose, supra note 1 (noting that, after eight hours of training AI to put the author’s face on Ryan Gosling’s body, a model produced video that was “blurry and bizarre”); Goggin, supra note 1 (noting the ease and speed with which users can create high-quality deepfake videos).

12. Cade Metz, Internet Companies Prepare to Fight the ‘Deepfake’ Future, N.Y. Times (Nov. 24, 2019), https://www.nytimes.com/2019/11/24/technology/tech-companies-deepfakes.html.

13. Id.; see also Kaylee Williams, Exploring Legal Approaches to Regulating Nonconsensual Deepfake Pornography, Tech Policy Press (May 15, 2023), https://techpolicy.press/exploring-legal-approaches-to-regulating-nonconsensual-deepfake-pornography/ (citing research from October 2022 noting a 31% increase in NDFP-related web searches over the preceding year).

14. Williams, supra note 13.

15. Samantha Cole, People Are Using AI to Create Fake Porn of Their Friends and Classmates, Vice (Jan. 26, 2018, 2:00 PM), https://www.vice.com/en/article/ev5eba/ai-fake-porn-of-friends-deepfakes (discussing a Discord chatroom, which has since been shut down, with users sharing tips about creating NDFP videos of people they knew).

16. Danielle Keats Citron, Sexual Privacy, 128 Yale L.J. 1870, 1921 (2019).

17. See Delfino, supra note 11, at 897 (noting the similarities of harmful consequences stemming from NCP and NDFP).

18. Franks, supra note 6, at 481–82.

19. See Williams, supra note 13 (noting the emotional harm of knowing you may have to confront the existence of NDFP during social interactions and stating, “Although few studies have explored the individual harms caused by [nonconsensual deepfake pornography] specifically, there is a large body of evidence suggesting that victims of any form of image-based sexual abuse are more likely than non-victims to report adverse mental health consequences like posttraumatic stress disorder, anxiety, depression, and suicidal ideation, as well as challenges keeping or maintaining meaningful employment after being victimized.”); Sarah Beechay, Note, If I Go There Will Be Trouble, If I Stay There Will Be Double: Revenge Porn, Domestic Violence, and Family Offenses, 57 Fam. Ct. Rev. 539, 540–41 (2019) (citing a survey of 361 self-identified victims finding that 42% sought professional mental services, 51% had suicidal ideation, 26% took leave at work or school to cope or pursue legal action, 6% lost their jobs or were expelled from school, and 3% chose to legally change their names); Andrew Gilden, The Queer Limits of Revenge Porn Laws, 64 B.C. L. Rev. 801, 810 (2023) (“Victims [of NCP] have suffered from panic attacks, extreme anxiety, and eating disorders; feared leaving their homes; and withdrawn from social life, online and off.”).

20. Franks, supra note 8, at 1258 (“Nonconsensual pornography often plays a role in intimate partner violence, with abusers using the threat of disclosure to keep their partners from leaving or reporting their abuse to law enforcement.”).

21. See id. at 1283 (“What nonconsensual pornography always involves is an invasion of privacy, and the harm it always inflicts is a loss of privacy.” (emphasis in original)).

22. See Brown, supra note 3, at 13–14 (noting that victims “may doubt their ability to prove that the video is false, and make decisions to comply with the extortion based on their perceived inability to correct the misinformation”).

23. Congressman Joe Morelle has proposed a statute, but it has yet to be adopted. See Press Release, Joe Morelle, Congressman, House of Representatives, Congressman Joe Morelle Authors Legislation to Make AI-Generated Deepfakes Illegal (May 5, 2023), https://morelle.house.gov/media/press-releases/congressman-joe-morelle-authors-legislation-make-ai-generated-deepfakes.

24. 15 U.S.C. § 6851(b)(3)(A)(i).

25. Id. § 6851(b)(3)(A)(ii).

26. Id. § 6851(b)(1)(A).

27. Id. § 6851(a)(3).

28. Id. § 6851(a)(5)(A).

29. Id. § 6851(a)(3).

30. See New York v. Ferber, 458 U.S. 747, 764 (1982) (“When a definable class of material . . . bears so heavily and pervasively on the welfare of children engaged in its production, we think the balance of competing interests is clearly struck and that it is permissible to consider these materials as without the protection of the First Amendment.”).

31. See id. at 761 (noting that “a sexually explicit depiction need not be patently offensive in order to have required the sexual exploitation of a child for its production” and that the Miller v. California, 413 U.S. 15 (1973), standard applied in obscenity cases is not “a satisfactory solution to the child pornography problem”) (internal quotations omitted).

32. See id. (noting the child who was abused would find it irrelevant whether the underlying content had any value).

33. 458 U.S. at 761–63 (finding the obscenity standards enunciated in Miller irrelevant in evaluating the harm children face in the production of CSAM).

34. Id. at 757–58 (internal citation omitted).

35. Id. at 758.

36. Id.

37. Id. at 759.

38. Id.

39. Id. at 761.

40. Id. at 762.

41. Id. at 763–64.

42. Id.

43. Id. at 764.

44. See supra text accompanying notes 17–22.

45. See id.

46. See id.

47. See Franks, supra note 8, at 1309–10 (“We do not have two First Amendments, one for civil law and one for criminal law; and it is certainly not the case that the Supreme Court has decided that civil laws categorically raise fewer or less serious First Amendment issues than the latter.”).

48. See Ferber, 458 U.S. at 762.

49. Id. at 764 (“There are . . . limits on the category of child pornography . . . . [T]he conduct to be prohibited must be adequately defined by the applicable state law.”).

50. Id. at 772.

51. Id. at 772–74 (quoting Broadrick v. Oklahoma, 413 U.S. 601, 615–16 (1973)) (“[W]hatever overbreadth may exist should be cured through case-by-case analysis of the fact situations to which its sanctions, assertedly, may not be applied.”).

52. 535 U.S. 234 (2002).

53. Id. at 258.

54. Id. at 239.

55. Id. at 239–40.

56. Id. at 240.

57. Id. at 249.

58. See id. (“Like a defamatory statement, each new publication of the [CSAM] speech would cause new injury to the child’s reputation and emotional well-being.”).

59. Id. at 251.

60. Id. at 254.

61. Id.

62. See id. (“In the case of the material covered by Ferber, the creation of the speech is itself the crime of child abuse . . . .”).

63. Id. at 242.

64. 15 U.S.C. § 6851(b)(1)(A).

65. See id. § 6851(b)(4)(A)–(D). The exceptions include “an intimate image that is commercial pornographic content, unless that content was produced by force, fraud, misrepresentation, or coercion of the depicted individual”; good faith disclosures of the content to law enforcement, in legal proceedings, as part of medical education, or in reporting or investigating unlawful and unsolicited conduct; disclosures for matters of public concern or public interest; and “a disclosure reasonably intended to assist the identifiable individual.” Id.

66. Id. § 6851(b)(1)(A).

67. United States v. Stevens, 559 U.S. 460, 468 (2010).

68. Id. at 470–71.

69. Id. at 471 (quoting Ferber, 458 U.S. at 764) (internal quotations omitted).

70. Id.

71. Id. at 472.

72. Alix Iris Cohen, Note, Nonconsensual Pornography and the First Amendment: A Case for a New Unprotected Category of Speech, 70 U. Miami L. Rev. 300, 313 (2015) (arguing against an originalist approach to First Amendment exceptions because “the Founders had a more limited view of the Free Speech Clause than the general view today” but noting that “even under a historical approach” NCP still has “similarities to historically unprotected speech [that] make it more like a reconfiguration of existing categories, rather than an entirely new one”; thus, it should still be an unprotected category of speech).

73. Franks, supra note 8, at 1317.

74. See Emily Pascale, Note, Deeply Dehumanizing, Degrading, and Violating: Deepfake Pornography and the Path to Legal Recourse, 73 Syracuse L. Rev. 335, 362 (2023) (“By rejecting Stevens’ historical treatment test, the Supreme Court would be free to use Ferber’s rationales to recognize deepfake pornography as an additional category of unprotected speech.”).

75. Ashcroft, 535 U.S. at 242.

76. Id.

77. Compare United States v. Hotaling, 634 F.3d 725, 728 (2d Cir. 2011), and Doe v. Boland, 698 F.3d 877, 885 (6th Cir. 2012), and United States v. Mecham, 950 F.3d 257, 260 (5th Cir. 2020) (all finding that morphed child pornography was unprotected speech), with United States v. Anderson, 759 F.3d 891, 895 (8th Cir. 2014) (rejecting that child pornography morphing constituted unprotected speech because the face was morphed onto adult bodies and therefore did not feature child abuse in making the pornography, as was key in Ferber). For a discussion of these cases, see also Pascale, supra note 74, at 353–57 (arguing that “[i]n the years following Ashcroft, a majority of circuits heeded the Court’s suggestion, concluding that morphed child pornography does not deserve First Amendment protections. In doing so, the lower courts began to lay a path for extending Ferber to adult deepfake pornography”).

78. See supra text accompanying notes 17–22 for a discussion of the harms caused.

79. Ashcroft, 535 U.S. at 259 (Thomas, J., concurring) (noting that if “it becomes impossible to enforce actual child pornography laws because the Government cannot prove that certain pornographic images are of real children . . . the Government should not be foreclosed from enacting a regulation of virtual child pornography that contains an appropriate affirmative defense or some other narrowly drawn restriction”).

80. Ashcroft, 535 U.S. at 265 (O’Connor, J., dissenting) (further noting that “[r]eading the statute only to bar images that are virtually indistinguishable from actual children would not only assure that the ban on virtual child pornography is narrowly tailored, but would also assuage any fears that the ‘appears to be . . . of a minor’ language is vague”).

81. Berisha v. Lawson, 141 S. Ct. 2424, 2428–29 (2021) (Gorsuch, J., dissenting from denial of certiorari) (quoting in part Gertz v. Robert Welch, Inc., 418 U.S. 323, 364 (1974)) (arguing that the very high actual malice standard required for defamation cases against public officials should be revisited in part because the rise of technology has broadened online presence so that “‘voluntarily or not, we are all public [figures] to some degree’” which “has come to leave far more people without redress than anyone could have predicted”).

82. See Pascale, supra note 74, at 362–63 (arguing that, even though these Supreme Court Justices’ statements were made in the context of CSAM and defamation, they suggest the Court’s potential willingness to uphold regulations on technological advances).

83. Franks, supra note 8, at 1312 (noting that “the few categories that the Court has explicitly determined not to receive First Amendment protection” are not exhaustive).

84. See 15 U.S.C. § 6851(b)(2)(A) (focusing on the nonconsensual nature of the distribution regardless of whether consent existed for the creation).

85. See Franks, supra note 8, at 1318 (noting that regulations are content-neutral when they do not address the content of the speech).

86. Cf. id. at 1317 (arguing that the author’s model NCP legislation, which is similar to 15 U.S.C. § 6851, would not impinge on political speech).

87. Cf. Dun & Bradstreet, Inc. v. Greenmoss Builders, 472 U.S. 749, 758–59 (1985) (“It is speech on ‘matters of public concern’ that is ‘at the heart of the First Amendment’s protection.’” (quoting First Nat’l Bank of Boston v. Bellotti, 435 U.S. 765, 776 (1978))).

88. Packingham v. North Carolina, 582 U.S. 98, 105–06 (2017) (citations omitted) (quoting McCullen v. Coakley, 573 U.S. 464, 486 (2014)).

89. See supra text accompanying notes 17–22.

90. See Franks, supra note 8, at 1323 (noting the privacy and personal interests at stake and concluding NCP regulations written with the same protections as 15 U.S.C. § 6851 withstand intermediate scrutiny).

91. See Franks, supra note 8, at 1323 n.385, for a discussion of one such scholar’s argument.

92. Cf. id. at 1323–24 (noting that statutes modeled after the article’s proposed statute, like 15 U.S.C. § 6851, are not viewpoint-based but focus on harm in disclosure and therefore do not trigger strict scrutiny).

93. See supra text accompanying notes 17–22.

94. Franks, supra note 8, at 1321–26 (noting legal restrictions for protecting privacy, disclosing personally identifying information, criminalizing voyeurism, and unlawful surveillance, and “common-law protections against publicizing the private life of another”).

95. United States v. Playboy Entm’t Grp., 529 U.S. 803, 813 (2000).

96. Anderson, 759 F.3d at 896 (alteration in original) (quoting Ashcroft, 535 U.S. at 250).

97. Id.

98. Id.

99. See supra text accompanying notes 17–22.

100. Franks, supra note 8, at 1311 (“While such [privacy] laws can be controversial in some cases, there appears to be general agreement that protecting sensitive information like medical records or Social Security numbers is something the law can and should do.”).

101. Citron, supra note 16, at 1874.

102. Michaels v. Internet Entm’t Grp., Inc., 5 F. Supp. 2d 823, 840 (C.D. Cal. 1998) (“[V]isual and aural details of . . . sexual relations[] [are] facts which are ordinarily considered private even for celebrities.”).

103. Cohen, supra note 72, at 321 (“[A]llowing privacy to be repeatedly invaded is what ultimately would stop people from speaking freely, thus diminishing ideas in the marketplace, hindering participatory democracy, and reducing autonomy.”).

104. Franks, supra note 8, at 1321–22.

105. Citron, supra note 16, at 1921 (“Much like nonconsensual pornography, deep-fake sex videos exercise dominion over people’s sexuality, exhibiting it to others without consent.”).

106. See Franks, supra note 8, at 1323 (stating in relation to NCP that “[c]ombating this form of sex discrimination is not only consistent with longstanding First Amendment principles, but comports with equally important Fourteenth Amendment equal protection principles”); Citron, supra note 16, at 1874 (noting that NDFP most often affects “women, nonwhites, sexual minorities, and minors”).

107. Ferber, 458 U.S. at 763–64.

108. See id. at 772–74 (“Whatever overbreadth may exist should be cured through case-by-case analysis of the fact situations to which its sanctions, assertedly, may not be applied.” (quoting Broadrick, 413 U.S. at 615–16)).