
Study Finds Teen Girls Using AI For Sexual Images

By Pam Belluck
Science & Health
Last updated: March 18, 2026 7:10 pm

A peer-reviewed study has found that teen girls are using AI-powered “nudification” tools to create sexualized images at roughly the same rate as teen boys, a result that challenges long-standing assumptions about who engages most with sexual content online and raises urgent questions for parents, platforms, and policymakers.

Published in PLOS One and led by George Mason University digital forensics researcher Dr. Chad M.S. Steel, the survey of English-speaking adolescents ages 13 to 17 paints a picture of rapid normalization: many teens are experimenting with tools that remove clothing or generate explicit composites from everyday photos, and a sizable share report being victimized by non-consensual creations or sharing.

Table of Contents
  • What the study found about teen use of AI nudification
  • Why girls are turning to nudification and similar AI tools
  • The scale of harm and the legal blind spots for minors
  • What parents and platforms can do to reduce AI image harm
A young woman taking a selfie in a mirror, holding a white smartphone to her face.

What the study found about teen use of AI nudification

Steel’s January 2025 online survey of 557 teens found that 55% had created a sexualized image using AI or related editing tools, and 54% had received one. More than a third said a non-consensual image of them had been generated, and about a third reported an image of theirs was shared without permission—evidence that harm is not incidental but widespread.

Usage parity stood out. Contrary to patterns in sexting and pornography consumption, where boys often outpace girls, teen girls reported using nudification tools at similar rates. Approximately 1 in 6 girls and boys said they used these apps frequently to “see how they looked,” and girls were about as likely as boys to have shared such images “once or twice” with someone else.

There were gendered differences in misuse: boys reported higher use of generative AI to create or distribute sexual images of others, both with and without the subject’s consent. That finding tracks with broader research on image-based abuse, where perpetrators skew male while victims skew female.

Steel emphasized the need for replication with a larger sample. Independent experts, including Linda Charmaraman of Wellesley College’s Youth, Media, & Wellbeing Research Lab, noted that while the study used quality checks and sought a nationally representative sample, online recruitment can tilt toward tech-savvy participants. Even so, the consistency of the trends suggests a fast-moving cultural shift.

Why girls are turning to nudification and similar AI tools

The study did not ask teens why they used these tools, but researchers point to a confluence of forces. Girls spend significant time with “try-on” and beauty filters that manipulate appearance in realistic ways; nudification apps rely on similar interfaces, lowering the barrier to experimentation. In parallel, peer dynamics and coercion—well documented in adolescent sexting—can nudge behavior under the guise of fitting in, flirting, or avoiding social penalties.

Adolescence is also a period of identity exploration. When frictionless AI promises private, instant previews of one’s body, teens may misread the risks, especially if they believe synthetic images are less “real” or less likely to spread. That perception is dangerously outdated in the era of one-tap sharing, anonymous accounts, and opportunistic predators.

The scale of harm and the legal blind spots for minors

Experts stress a critical legal reality: any sexual image of a minor—including AI-generated or self-produced content—can be classified as child sexual abuse material. Teens often do not grasp that line, and enforcement around consensual peer-to-peer sharing is inconsistent, but the risk of lifelong consequences remains. Non-consensual creation or distribution is frequently a separate offense under image-based abuse or deepfake laws.

A person’s hands adjust the face of a realistic humanoid robot head, with another robot head mechanism visible to the right.

Warning signs are mounting beyond this single study. The National Center for Missing & Exploited Children’s CyberTipline received 36.2 million reports in 2023, and analysts there have cautioned that AI is accelerating the volume and sophistication of manipulated content. The Internet Watch Foundation and child-safety nonprofits have similarly flagged a surge in AI-fabricated child sexual abuse material, complicating detection and takedown efforts.

These dynamics feed into a growing sextortion threat. The FBI has issued repeated alerts about schemes targeting minors, where offenders use stolen or fabricated images to coerce additional content or money. Because nudification can produce convincing fakes from ordinary photos, teens with public profiles are especially exposed—even if they never generated explicit images themselves.

What parents and platforms can do to reduce AI image harm

Researchers recommend frank, nonjudgmental conversations at home that focus on consent, legality, and the permanence of digital sharing, rather than blanket abstinence messages that teens are likely to tune out. Ask what your teen is seeing, what pressures they feel, and how they would respond if a peer requested—or threatened to share—sexualized images. Normalize help-seeking if something goes wrong.

Practical steps matter:

  • Keep social accounts private and limit followers to people your teen actually knows.
  • Discuss the risks of reposts and screenshots.
  • If images are shared without consent, preserve evidence, report to platforms and appropriate authorities, and contact school officials if relevant.

For removal help, NCMEC’s Take It Down service allows minors to create a unique digital fingerprint of intimate images so that participating platforms can find and remove copies.

Responsibility does not rest solely with families. Safety researchers urge platforms to build guardrails by default:

  • Detect and block nudification pipelines.
  • Add friction and clear warnings before uploads that could be sexualized.
  • Enable minor accounts to disable resharing.
  • Expand fast-track reporting for image-based abuse.

Policymakers are weighing “duty of care” standards and bystander education models so teens learn to intervene when peers plan to create or spread AI sexual imagery. The study’s most sobering takeaway is that nudification is already part of teen digital life; the window for proactive safeguards is now.

By Pam Belluck
Pam Belluck is a seasoned health and science journalist whose work explores the impact of medicine, policy, and innovation on individuals and society. She has reported extensively on topics like reproductive health, long-term illness, brain science, and public health, with a focus on both complex medical developments and human-centered narratives. Her writing bridges investigative depth with accessible storytelling, often covering issues at the intersection of science, ethics, and personal experience. Pam continues to examine the evolving challenges in health and medicine across global and local contexts.
FindArticles © 2025. All Rights Reserved.