National nonprofit has message for anyone creating child sexual abuse material using AI

Rita Garcia
Friday, November 15, 2024
Nonprofit has this message as it sees rise in production of deepfakes
The National Center for Missing and Exploited Children has a strong message for anyone creating child sexual abuse material using AI.

The National Center for Missing and Exploited Children is seeing a rise in calls about peer-to-peer abuse involving the production of deepfakes. What some may not know is that these images can also be considered a form of child sexual abuse material, or CSAM.

A spokesperson with the national nonprofit said they have a strong message for any and all creators who are now throwing artificial intelligence into the mix.

Stacy Garrett, the vice president of content and community engagement with NCMEC, said people should know there is no difference between producing child sexual abuse material and creating it using AI.

She said it's not only harmful but illegal.

Unfortunately, she said, it's also a crime they're seeing more and more of as the internet and all sorts of apps become easily accessible.

SEE ALSO: National nonprofit continues helping teens scrub deepfakes or nonconsensual explicit images

The National Center for Missing & Exploited Children's "Take It Down" initiative continues to help teens scrub deepfakes or nonconsensual photos.

"I think sometimes it's intended as a prank. Maybe it's a joke," Garrett said. "Usually, what we're seeing used are images that are available on kids' social media accounts or even school photos that have been taken. Then, with the use of AI programs that are online that are readily accessible, they're creating nude or sexually explicit images of their peers in many cases."

Garrett also gave an example of how even a school yearbook photo can be weaponized these days. Because the nonprofit is seeing an uptick in these types of crimes reported to its tip line, Garrett said she has a message for anyone putting out this harmful content.

"I would want kids to know that they should not circulate that content, even if they're not the creator," Garrett said. "They can still be an upstander and not a bystander by ensuring that if it's shared with them, it stops there and that it doesn't go any further. And then, I think the other important message - and this one is maybe the most important, is for victims. They should know that there is help out there, that they have resources that they can turn to for help, whether it's reporting it to the cyber tip line at the National Center or using services like 'Take It Down,' which can assist with the removal of explicit or sexual content created before someone was 18 years old."

In terms of consequences, Garrett acknowledged that legislation still needs to catch up with the evolving digital world. Even so, she said, in some cases they've seen a strong response from schools, such as expulsions, which can affect a student in the long run.

ABC13 put together a list of available resources, including the "Take It Down" platform, which launched in 2022. On this platform, teenagers can anonymously report any non-consensual image shared online.

Teens should know that the photo they're reporting never leaves their device; nothing has to be uploaded. Instead, the platform generates what's called a "hash value," or digital fingerprint, and from there NCMEC and its partners, including participating companies and law enforcement, use it to make sure the picture isn't circulated.
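The idea of a hash value can be illustrated with a short sketch. This example uses a plain SHA-256 cryptographic hash purely for illustration; the article does not describe which hashing technique "Take It Down" and its partners actually use, and real image-matching systems often rely on perceptual hashes that can also match resized or re-encoded copies.

```python
import hashlib

def image_fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest serving as a simplified 'digital fingerprint'.

    SHA-256 is an illustrative stand-in: it matches only byte-identical
    copies of a file, whereas production systems may use perceptual
    hashes. The key point is that the fingerprint is derived from the
    image, but the image cannot be reconstructed from the fingerprint.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# The original image never leaves the device: only this short
# fingerprint would be submitted, and platforms can compare
# fingerprints of uploaded files against the submitted list.
fingerprint = image_fingerprint(b"example image bytes")
print(fingerprint)  # a 64-character hexadecimal string
```

Because the same input always produces the same fingerprint, platforms can recognize a reported image anywhere it appears without ever receiving the image itself.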

The nonprofit also has a cyber tip line on its website, which walks you through a list of questions to fill out before submitting a report.

Lastly, iWatch Texas is another resource.

This is a state system where you can report suspicious activity or school threats. In fact, Texas DPS says it has seen a rise in the number of reports being submitted and believes it's because people are becoming more aware of the online tool.

SEE RELATED: 'Take It Down' initiative aims to help teens with non-consensual photos online

The platform by the National Center for Missing & Exploited Children works to ensure non-consensual pictures aren't being circulated online.
Copyright © 2024 KTRK-TV. All Rights Reserved.