
How AI is being used to create deepfakes that harm children
Clip: 3/22/2025 | 5m 7s | Video has Closed Captions
A new report offers a troubling look at the latest digital threat to young people: deepfake nudes. These are realistic-looking photos and videos that have been altered using AI technology to depict subjects in sexually explicit situations and then spread online. Stephanie Sy speaks with Melissa Stroebel at Thorn, a nonprofit dedicated to protecting children online, to learn more.

JOHN YANG: A new report offers a troubling look at the latest digital threat to young people: deepfake nudes.
These are realistic photos and videos that have been altered using AI technology to depict the subjects in sexually explicit situations and then spread online.
Stephanie Sy spoke with Melissa Stroebel.
She's vice president of research and insights at Thorn, a nonprofit dedicated to protecting children online.
STEPHANIE SY: Melissa, thank you so much for joining us.
Before we get to these findings, could you start by just explaining what deepfake nudes are?
MELISSA STROEBEL, VP of Research and Insights, Thorn: Absolutely.
So deepfake nudes are synthetic media creations that depict a real person in a sexually suggestive or explicit situation or activity.
STEPHANIE SY: What were the main findings in this report?
Who was impacted the most by the creation and spread of these deepfakes?
MELISSA STROEBEL: So what we heard from young people in this survey, and we surveyed about 1,200 13- to 20-year-olds, is that, unfortunately, this is becoming an all too common experience as they grow up.
One in eight told us that they knew someone who had been specifically targeted and impacted by deepfake nudes.
On top of that, we heard that one in 17 had themselves been targeted by deepfake nude abuse.
Now, sometimes that number can feel a little bit small.
One in 17, that's a small percentage.
But when we think about what 17 looks like in our communities, that's the size of our kids' classroom, that's the size of their soccer team.
So this is really far too high a number for kids to be experiencing this type of victimization.
The other important thing that came through here was about the availability of this technology.
And for the kids who told us that they had created deepfake nudes of someone else, the technology is really readily available.
It's available through social media, through browsers, and through app stores.
But the good news is that most kids realize this is a harmful behavior.
And so there's a lot of opportunity for us to be having conversations with them and reinforcing those perspectives.
STEPHANIE SY: You know, one interesting part of your research that I read were the responses from some of these teens.
Even though we call these manipulated images deepfakes, in a way they're very real, right, especially if you're a victim of deepfakes.
MELISSA STROEBEL: Yeah, very true.
For those young people who've experienced deepfake nude abuse, they have shared with us stories of severe anxiety, fear, and shame, as well as worries that they won't be believed or that their experiences will be dismissed because of the involvement of generative AI technologies.
STEPHANIE SY: What is being done about AI safety right now?
What should be done?
Because we're basically talking about the sexual abuse of minors.
MELISSA STROEBEL: That's absolutely right.
At the end of the day, whether generative AI technology was involved or not, an explicit image of a minor is still child sexual abuse material.
And that's a really important starting point.
There are responses across the ecosystem taking place.
For example, our organization has been working directly with tech companies to make sure that these models are being built as safely as possible.
And if they are aware of abusive models, that those are being made unavailable so that they can't be so readily accessed.
But there's a lot more work to be done.
Right now we don't have a consistent institutional response that's reinforcing what kids already suspect.
And that's an opportunity where we need to lean in and offer more guidance for kids.
STEPHANIE SY: In the meantime, what should parents and other caretakers be doing to make sure that their teens are safe?
MELISSA STROEBEL: Having really open and early conversations with the young people in our lives is going to be one of the most important steps we can take at home.
Make sure that kids understand this is not a joke, it is not funny, and it carries real consequences for the kids who are being targeted.
Naming that openly and directly at home is an important first step, but adults elsewhere in our communities, such as in schools, can be doing more as well.
There's a real need for clear guidance within schools, for their student bodies, that this is not permissible behavior.
There are student handbooks in place that address things like harassment.
This is something we can lean into, acknowledging this type of emerging risk and putting policies in place that make sure schools are prepared to respond in a victim-centered way.
STEPHANIE SY: Melissa Stroebel, vice president of research and insights at Thorn.
Thank you so much for joining us.
MELISSA STROEBEL: Thank you.
Major corporate funding for the PBS News Hour is provided by BDO, BNSF, Consumer Cellular, American Cruise Lines, and Raymond James. Funding for the PBS NewsHour Weekend is provided by...