Voice-cloning AI scams on the rise: Arizona AG and ASU professor issue warning

The Arizona Attorney General is warning that scammers are using artificial intelligence to clone voices, tricking victims into believing a loved one is on the phone and needs money.

AG Kris Mayes says her office is receiving a growing number of calls about this scam, and she wants to warn the public now so fewer people are tricked into sending money.

"I think this technology has evolved more quickly than any technology in human history," Mayes remarked.

Artificial intelligence can now clone voices, and scammers are taking advantage, using other people's voices in phone scams.

"If you get a call from someone on the other line that sounds like your mother, you can't assume that's your mother. Unfortunately, it's somewhat awkward to ask your mother some kind of a password, 'Are you really my mom? Can you answer the following question?' but that's where we are right now," said Professor Subbarao Kambhampati with ASU's School of Computing and Augmented Intelligence.

Mayes says scammers are even taking it to the next level.

"What’s also happening is these scammers are, in some cases, using spoofing equipment that can spoof your phone number, so the combination of them being able to make it look like it’s actually your phone number in tandem with cloning your voice is making these scams and frauds even more dangerous," she said.

Kambhampati says artificial intelligence needs regulation, in part because of abuses like voice-cloning scams.

"They expect that you use them for good purposes. For example, you might want to say a story to your kid in their grandmother's voice … that's the kind of things they expect you to use it for," he said.

If you get a call and you're unsure whether the caller is who they claim to be, Mayes says to call that person's actual phone number to verify, and to report scams to local law enforcement and the Attorney General's Office.

Just 3 seconds is all it takes

All a scammer needs is three seconds of someone talking. That audio is fed into a website, which then generates a synthetic voice.

From there, you can type whatever you want it to say.

"The surprising part is it doesn't take that much data, that much voice sample to actually train it to speak like you," Kambhampati said. "The current state of the art is even with a three-second clip of your voice, the system can imitate you … and say any text."

A voice-cloning website may ask for 10 to 20 voice samples, each spoken with a different emotion.

"The better the voice sample, the better compelling imitation is, but you can do a pretty passable imitation with just 3 seconds of the voice," Kambhampati said.

After those voice samples are submitted, a voice is generated.

That voice can then say anything a person types, at the click of a button.
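The article doesn't name the website involved, but for a sense of how low the barrier is, here is a minimal sketch of this kind of zero-shot cloning using the open-source Coqui TTS library, whose XTTS v2 model can imitate a voice from a short reference clip. The file names and the text are hypothetical placeholders; the benign example echoes the bedtime-story use Kambhampati described.

    # Minimal sketch: zero-shot voice cloning with the open-source
    # Coqui TTS library (XTTS v2). File names and text are placeholders.
    from TTS.api import TTS

    # Load a pretrained multilingual model that supports voice cloning.
    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

    # A short clip of the target speaker serves as the reference sample;
    # the model then reads arbitrary typed text in that voice.
    tts.tts_to_file(
        text="Once upon a time, there was a little bear who lived in the woods.",
        speaker_wav="reference_clip.wav",  # a few seconds of the target voice
        language="en",
        file_path="cloned_story.wav",
    )

The point is not this particular library: a pretrained model, one short reference clip, and a single function call are all the workflow requires.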

"Many people grew up thinking they can trust their eyes, they can trust their ears. That's no longer true," Kambhampati said.