With the spread of generative AI, a new phenomenon has emerged: non-consensual deepfake pornography (NDP), a form of image-based sexual abuse (IBSA) in which AI technology is used to insert the likeness of uninvolved individuals into pornographic images and videos without their consent.
A recent study by the Media Psychology and Media Design Group at TU Ilmenau analyzed 134 Reddit discussions on the topic, examining posts from the perspectives of perpetrators, victims, and bystanders. The findings show that the largest share of discussions was initiated by bystanders (45.5%), followed by victims (32.1%) and perpetrators (22.4%). Victims were predominantly women (85.8%), while perpetrators were predominantly men (81.3%), who frequently targeted individuals from within their close social networks.
Online communities such as Reddit forums offer support and moral guidance on the one hand, but on the other they also contribute to minimizing the phenomenon. Multi-level interventions are therefore essential, ranging from psychosocial support and legal resources to digital media literacy and public education on consent and technology-facilitated abuse.
- Le, T. D. & Döring, N. (2026). Perspectives on Non-Consensual Deepfake Pornography: A Content Analysis of Reddit Discussions Initiated by Perpetrators, Victims, and Bystanders. Sexuality & Culture. https://doi.org/10.1007/s12119-026-10566-x

