Study Finds Majority of Surveyed U.S. Teens Have Used AI to Create Nude Images
Background
A research team led by Chad Steel of George Mason University conducted an anonymous survey of 557 English‑speaking U.S. residents between the ages of 13 and 17. The study, published in the open‑access journal PLOS ONE, was carried out with parental consent and focused on the use of artificial‑intelligence (AI) tools for creating sexualized images.
Key Findings
The results show that 55.3% of the teens surveyed reported using nudification tools to create at least one nude image of themselves or of others. A similar proportion, 54.4%, said they had received AI‑generated nude images. Moreover, 36.3% indicated that a sexualized AI image of themselves had been created by someone else without their consent, and 33.2% reported that such images had been shared without their permission.
Male participants reported higher rates of both creating and distributing these images, whether consensually or non‑consensually. Apart from this gender difference, the study notes that the prevalence of these behaviors was largely consistent across demographic groups.
Implications and Reactions
Researchers warn that AI‑nudification tools remove the need for a willing participant: anyone with a photo and access to an app can generate a fake nude image. Victims experience consequences similar to those of other forms of child sexual exploitation material, including a sense of dehumanization and lasting disruption to their lives.
Steel summed up the shift, stating, "Teens are no longer just digital natives but AI‑natives. ‘Nudification’ and GenAI apps are their new ‘sexting,’ only with more challenging issues surrounding consent." The study’s authors hope the findings will prompt lawmakers and educators to address the problem before it becomes even more difficult to manage.
Industry Context
The report arrives amid broader debates about adult content generation in AI. OpenAI CEO Sam Altman is pushing to release an "adult version" of ChatGPT, while Elon Musk has advocated allowing Grok to generate R‑rated content. These discussions underscore growing concern over how AI‑generated sexual content will be used and regulated.