A father in Texas is warning other families about artificial intelligence scams after he said his father was scammed out of $1,000.
"When you're in anxiety and panic, when you're there to help your family, I think logic gets thrown out the window," Lee Hall told "Good Morning America." "That's what the scammers are betting on, so now we have to raise our level of sophistication so these things won't happen to our loved ones."
Hall said his father, whom he didn’t name, received a distressed call he thought was from his grandson, Christian, who is Hall's son.
"He told his grandfather that he was vacationing in Mexico with his friends. Got in trouble. Scared to death. He got into a little bit of drinking and got into an accident and now that's the reason why he's in a little bit of trouble and he needs some money to get him out of that situation," Hall said, recounting the call his father believed was from Christian.
The scammer allegedly used artificial intelligence, known as AI, to impersonate Christian’s voice to plead for help and money, according to Hall.
Hall said his father sent $1,000 to the alleged scammer.
"That story is very believable," Hall said of the call his father received. "What's scary is my father is totally off the grid. He doesn't have any social media accounts. He doesn't even have email. But the fact that they still connected my son, that makes it even scarier."
Hall said he and his wife confirmed it was a scam when they called their son, a college student.
“I just knew that that was a scam because we knew that [Christian] was okay," he said. "We knew he was in Dallas, so we called him right away and he said, ‘Yes, mom, I'm in college in Dallas,’ so we knew it was a scam."
Last month, Jennifer DeStefano, a mother in Arizona, told "GMA" how she also received an alarming call that turned out to be an AI voice-cloning scheme.
"It's my daughter's voice crying and sobbing, saying, 'Mom,'" said DeStefano. "And I'm like, 'Okay, what happened?' She's like, 'Mom, these bad men have me. Help me, help me.'"
Then, DeStefano said, a man demanded she pay a ransom in exchange for her daughter Briana's safe release. He told her he didn't want a wire transfer for the ransom, though; instead, he wanted to come pick her up.
Luckily, DeStefano was able to confirm her daughter was safe within minutes.
The Federal Trade Commission said scammers can easily use AI to clone a voice from a short audio clip, often one found on social media.
According to the FTC, impersonator scams were the most commonly reported scams in 2022, accounting for $2.6 billion in losses.
How to protect your family against AI scams
Samuel Levine, the director of the FTC's Bureau of Consumer Protection, said the scam is so effective because it imitates something personal and can trigger hasty action. Instead of reacting immediately, he said, people should remain calm and contact the authorities.
“They're exploiting the connection we have to our relatives and our friends to try to get people to turn over money or turn over personal information in a way that can cause a lot of harm,” he said.
Experts say that if a caller is demanding a wire transfer, a gift card or cryptocurrency, that's a major red flag it could be a scam.
Experts also advise keeping social media accounts private so scammers cannot replicate your voice from a post.
Because scammers often spoof phone numbers, experts say that if you receive one of these calls, you should hang up and dial your loved one's number directly instead of hitting redial.
Experts also recommend that families create a code word that family members can say to each other if there is really trouble.