EU Proposals Will Fail to Curb Nonconsensual Deepfake Porn


Existing and proposed laws will fail to protect EU citizens from nonconsensual pornographic deepfakes—AI-generated images, audio, or videos that use an individual’s likeness to create pornographic material without the individual’s consent. Policymakers should amend current legislative proposals to better protect victims and, in the meantime, encourage soft law approaches.

Although deepfakes can have legitimate commercial uses (for instance, in film or gaming), 96 percent of deepfake videos found online are nonconsensual pornography. Perpetrators superimpose the likeness of an individual—most often an actor or musician, and almost always a woman—onto sexual material without permission. Sometimes perpetrators share these deepfakes for purely lewd purposes; other times they do so to harass, extort, offend, defame, or embarrass their targets. With the increasing availability of AI tools, it has become easier to create and distribute nonconsensual deepfake pornography.

The EU has no laws specifically protecting victims of nonconsensual deepfake pornography, and its new legislative proposals will fall short.

The Digital Services Act (DSA) obliges platforms to establish procedures by which users can report illegal content and have it taken down. However, this will do little to stem the spread of nonconsensual pornographic deepfakes because the law does not classify them as illegal content. The DSA also obliges the largest platforms to undertake risk assessments, deploy mitigation measures, and subject themselves to audits to ensure they enforce their terms and conditions—but 94 percent of deepfake pornography is hosted on dedicated pornographic websites, not on mainstream platforms (which have already adopted policies to stop the spread of this content). Moreover, during last-minute negotiations, the EU dropped a provision from the DSA that would have required porn sites hosting user-generated content to swiftly remove material that victims flag as depicting them without their permission.

The Artificial Intelligence (AI) Act, likely to pass into law in 2023, requires creators to disclose that deepfake content is artificially generated or manipulated. But in the case of well-known individuals, disclosure would hardly deter perpetrators of pornographic deepfakes: demand for the content does not depend on its authenticity, nor would disclosure surprise viewers, who may already assume the content is fake.