Expert shares tips on how to respond to deepfake AI scams: 'Really scary'

Rita Garcia
Wednesday, May 29, 2024
Deepfake AI scams to look out for and how to protect yourself
Cybersecurity expert Mitchem Boles shared ways to avoid falling victim to deepfake artificial intelligence scams.

HOUSTON, Texas (KTRK) -- ABC13 is looking into "deepfake AI" and how it can manipulate your voice to create something convincing enough to fool your own family.

This is just one of the latest AI-generated scams, so here's what to look out for so you can protect yourself.

Mitchem Boles, a cybersecurity expert, says artificial intelligence is not only here to stay but is also getting more advanced.

He says that means we need to stay ahead of it to avoid falling victim to any AI scams, including deepfakes.

Boles explained that there are a number of websites and apps that let you upload voice samples, which they then use to synthesize a spoken message from typed text. He demonstrated this with software he easily downloaded.

"You're able to type in anything that you would like here in the text box," Boles said. "You can say anything you want. And that's what is really scary about this. And so when you're looking to generate it, it takes a second."

RELATED: How scammers use artificial intelligence to target search engine users

Experts say even the top search results on Google and other search engines can be fraudulent. Here's what to look out for.

Boles typed into the text box, "This is a fake AI version of my voice. This is not real."

He then clicked "generate" and let us listen to what it sounded like before taking the same voice sample and creating a voice message that was completely different from what was originally recorded.

"Now, there is some differentiation between each version as you type and generate again. But it's so legitimately sounding, especially on the phone. And so, hopefully, that sounds like my voice. I think it does a little bit," Boles said.

It can take as little as three seconds of audio to clone someone's voice with AI, but Boles says the more audio you give it, the more closely you can dial in on someone's voice, until it's almost indistinguishable from the real thing.
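For context, consumer voice-cloning tools like the one Boles demonstrated generally follow the same workflow: take a short reference recording, take a line of typed text, and synthesize new speech in that voice. Below is a minimal sketch of that workflow using the open-source Coqui TTS library and its XTTS voice-cloning model; the model name, file names, and sample clip are assumptions for illustration, not the specific software Boles used.

```python
# Minimal voice-cloning sketch (assumption: Coqui TTS with the XTTS v2 model,
# not the specific tool shown in the story).
from TTS.api import TTS

# Load a multilingual voice-cloning model; the exact model name may vary by version.
tts = TTS(model_name="tts_models/multilingual/multi-dataset/xtts_v2")

# "my_voice_sample.wav" is a hypothetical reference clip -- a few seconds of
# someone's real voice is enough to produce a recognizable clone.
tts.tts_to_file(
    text="This is a fake AI version of my voice. This is not real.",
    speaker_wav="my_voice_sample.wav",  # reference audio to clone
    language="en",
    file_path="cloned_message.wav",     # synthesized output message
)
```

The point of the sketch is how little the attacker needs: one short audio sample and a sentence of text are enough to generate a convincing voice message.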

To protect yourself, Boles says you should know that scammers typically check three boxes in their messages or phone calls.

They'll play on your emotions, create urgency, and fake familiarity. Your best bet is to hang up on the caller and call your loved one directly.

Boles also suggests having a family "code word" as a line of defense.

"When this information comes at you quickly, it's hard to digest it. Have that code word ready. Talk about it with your loved ones, with your family members, and have that ready so that you can use it in case you do legitimately need money or some other service that you need help with," Boles suggests.

SEE ALSO: Texas fights to preserve democracy amid rise of deepfakes in elections

Texas was the first state in the nation, and now one of 11, to legislate against deepfakes. Whether it's video, audio, or images, it's up to you, the voter, to take the extra step and spot the deception yourself.