
Scammers Are Using AI to Clone Your Loved One’s Voice


AI has the potential to make our lives a lot easier, allowing us to multi-task and save time. But its sophistication can also be used against us — by other people.

In an increasingly widespread scam, bad actors use AI to clone the voices of people's loved ones, then call their victims and use the cloned voice to ask for money under false pretenses, NBC Nightly News reported.


One father interviewed by the outlet revealed that he got a call he thought was from his daughter, saying she'd been kidnapped and was being held for ransom. He was so convinced that he grabbed cash and drove to a meetup location; only when his wife called their actual daughter did they discover it was a scam.

Last year, reported fraud losses increased 30% year over year to nearly $8.8 billion, and there were more than 36,000 reports of people being scammed by those pretending to be friends and family, according to data from the Federal Trade Commission.

Perpetrators of phone scams can pull voice snippets from social media — then use them to wreak havoc.

AI voice-generating software can decipher what makes a person's voice distinct — including age, gender and accent — then sift through an enormous database of voices to locate similar ones and find patterns, Hany Farid, a professor of digital forensics at the University of California, Berkeley, told The Washington Post.


The Federal Trade Commission is urging people to watch out for calls using voice clones: if a call from a loved one seems suspicious, hang up and call the person directly to verify the claim.


