The National Center for Missing and Exploited Children is seeing a rise in calls about peer-to-peer abuse involving the production of deepfakes, and what some may not know is that this content can also be considered a form of child sexual abuse material, or CSAM.
A spokesperson for the national nonprofit said it has a strong message for creators who are now throwing artificial intelligence into the mix.
Stacy Garrett, the vice president of content and community engagement at NCMEC, said people should know there is no difference between producing child sexual abuse material and creating it using AI.
She said it's not only harmful but illegal.
Unfortunately, she said, it's also a crime they're seeing more and more of as the internet and all sorts of apps become easily accessible.
"I think sometimes it's intended as a prank. Maybe it's a joke," Garrett said. "Usually, what we're seeing used are images that are available on kids' social media accounts or even school photos that have been taken. Then, with the use of AI programs that are online that are readily accessible, they're creating nude or sexually explicit images of their peers in many cases."
Garrett also gave an example of how even a school yearbook photo can be weaponized these days. Because of the uptick in these types of crimes reported to the organization's CyberTipline, Garrett said she has this message for anyone putting out this harmful content.
"I would want kids to know that they should not circulate that content, even if they're not the creator," Garrett said. "They can still be an upstander and not a bystander by ensuring that if it's shared with them, it stops there and that it doesn't go any further. And then, I think the other important message - and this one is maybe the most important, is for victims. They should know that there is help out there, that they have resources that they can turn to for help, whether it's reporting it to the cyber tip line at the National Center or using services like 'Take It Down,' which can assist with the removal of explicit or sexual content created before someone was 18 years old."
In terms of consequences, Garrett acknowledged that legislation still needs to catch up with the evolving digital world. But in some cases, she said, they've seen a strong response from schools, including expulsions, which can obviously impact a student in the long run.