When it comes to keeping children safe in schools and online, a national nonprofit is working to stop explicit images of minors from being shared.
The National Center for Missing & Exploited Children gave Eyewitness News the latest numbers on reports submitted to its CyberTipline, which show that submissions are continuing to trend up.
Add in the ability to manipulate photos and videos with artificial intelligence, and the numbers appear likely to keep climbing.
ABC13 is looking into how the organization is helping teens whose non-consensual or risqué photos are being shared online.
The initiative, called "Take It Down," was launched by NCMEC in 2022.
Jennifer Newman, executive director of NCMEC's Exploited Children Division, says it's simple to use and critical for any teenager or victim whose non-consensual image is being shared online.
"A child maybe took a picture of themselves and meant to just send it to a friend or to a boyfriend. Unfortunately, it's been now shared way past what they originally intended. Or if they've been enticed and someone to, you know, blackmailed them or sextorted them into taking a picture. And now that picture's out there, all they need to go need to do is go to 'Take It Down,'" Newman said.
You can remain anonymous, and Newman says it's important to mention that your image is not uploaded to the site. The photo will never even leave your device.
Newman explained how the platform grabs what's called a hash value, or "digital fingerprint." Then, partner companies, including law enforcement, can use that hash value to make sure the picture isn't circulated online.
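Newman's description maps onto a standard software technique: a one-way hash function turns a file into a short, unique string of characters on the user's own device, so only that string, never the photo itself, has to be shared. NCMEC has not published the exact method, and image-matching services often use perceptual hashes (which can also catch near-identical copies) rather than a plain cryptographic hash, but as a rough illustration only, here is a minimal Python sketch of computing a fingerprint locally. The filename is hypothetical.

import hashlib

def file_fingerprint(path: str) -> str:
    # Hash the file in 64 KiB chunks so even a large photo or
    # video never has to be loaded into memory all at once.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    # Only this short hex string would ever be shared;
    # the image itself stays on the device.
    return digest.hexdigest()

if __name__ == "__main__":
    # "photo.jpg" stands in for a file on the user's device.
    print(file_fingerprint("photo.jpg"))

Because the function only runs one way, a platform that receives the fingerprint can check uploads against it without ever being able to reconstruct the original image; a perceptual hash extends the same idea to copies that have been resized or lightly edited.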
Newman says more than 227,000 hashes have been submitted since the program launched in 2022, but with technology rapidly changing, the organization is now examining what kind of impact generative AI will have going forward.
"We've seen where offenders are uploading images of existing child sexual abuse, imagery, and just tweaking it, making it different, perhaps changing the gender of the child. It's awful stuff. And then you know what we're seeing a lot of is almost this peer abuse in these notify and unclothed apps. There might be a picture that a child posted, you know, several years ago from prom, and somehow that picture now the dress is removed using generative AI. It looks very realistic. It looks like a nude photo of that child, which is as damning and can be very psychologically traumatic for these victims," Newman said.