An expert warns that AI voice cloning scams are on the rise.


Scammers are increasingly using artificial intelligence (AI) tools to mimic voices they find on social media, then place panicked phone calls to the targets' family members and friends in an attempt to trick the unwitting recipients into handing over money or sensitive information. Michael Scheumack, chief innovation officer at IdentityIQ, an identity theft protection and credit monitoring firm, told FOX Business: "We're seeing a lot of these calls being made by people who are trying to get access to financial and confidential information. We've seen AI slowly start to enter these areas of cybercrime, and we've seen a sudden and rapid increase over the last year or so."

In advanced, targeted phishing scams, AI is being used to generate very specific emails with wording tailored to the intended victim. AI-based voice cloning scams have likewise been on the rise over the past year, making this a particularly frightening trend. Scammers who commit voice clone fraud record people's voices or find audio clips on social media or elsewhere on the internet. "All it takes is three to ten seconds of audio to get a very realistic cloned voice," Scheumack explained. The voice samples are then run through an AI program that duplicates the voice. The scammer can make the cloned voice say whatever he or she types, and can even add emotions such as laughter or fear, depending on the script of the scam.

To demonstrate how sophisticated AI voice cloning programs have become, IdentityIQ pulled an audio sample from an interview the author of this article gave this spring on the Fox News Rundown podcast. From that sample, the firm created an AI voice clone simulating a panicked phone call to a family member demanding a cash-app transfer after a fictitious car accident. The cloned voice said: "Mom, I need to talk to you. I was supposed to go interview someone today, but I got into a car accident. I'm okay, but I need your help right now. I hit the bumper of the other car. He wants $1,000 for repairs or he will call the police and report it to my insurance. I need the money now; can you send me $1,000 on Zelle? I'll show you how to do it."

Scheumack noted that voice clone calls from scammers are usually shorter than this example, and that when conveying demands for money, account access, or other sensitive information, scammers may try to cut off any chance for conversation by saying something like "I can't talk right now." "The goal of the scammer is to put you in a fight-or-flight state, to instill a sense of urgency that your loved one is in some kind of trouble. So the best way to deal with such a situation is to hang up the phone and immediately call your loved one to verify whether it is really him or her," he explained.

Scheumack cited an example from a recent interview conducted by IdentityIQ with an individual who received what he thought was a panicked call from his daughter at camp, but which was actually an AI-generated clone of his daughter's voice. The scammer had found the daughter's social media post about going camping and used it to make the call more realistic. Scammers who commit AI voice fraud use AI programs to scour the internet for information about individuals and businesses, including audio and video posted on social media and elsewhere, looking for details they can use to make a more convincing call to an unwitting victim, Scheumack notes.

"The frightening part is that this is not the work of a lone individual. This is a sophisticated organization, and it is not one person doing everything. The person who collects samples of your voice is not the same person who clones it. Someone else actually makes the call. And if the scam is successful, yet another person goes to the victim's home to collect the money," he said.

As for steps individuals can take to avoid becoming victims of AI voice clone scams, Scheumack said they should be careful about what they post publicly online and think twice before responding to an urgent call from an unknown number that is ostensibly from someone they know. "Be cautious about what you post online," he said. "The second step is awareness in general: if you get a call from an unknown number and it is supposedly a relative or loved one in an urgent situation, that should be a red flag. Definitely take some time to think about it."

Source : https://www.foxbusiness.com/technology/ai-voice-cloning-scams-on-rise-expert-warns
