Comment on The Terrifying A.I. Scam That Uses Your Loved One’s Voice: A couple in the U.S. got a call from relatives who were being held for ransom. Their voices—like many others these days—had been cloned

Sal@mander.xyz 9 months ago

For many years now I have heard of scams like this happening to people I know in Mexico, but without AI. In its most basic form, the scammer does not need to know who they are calling, because the scam relies largely on volume and the psychology of fear.

The victim picks up, someone screams something along the lines of ‘mom/dad please help’, and then the “kidnapper” takes the phone away, says that they have taken the victim’s daughter/son hostage, and demands that they not hang up the phone. They do this to several numbers until someone takes the bait and freaks out, often revealing additional information (like their kid’s name) in the process.

With AI, the scammer could spend the time to collect information and make the scam more believable. But I don’t think the voice is the bottleneck for these scams. Those who have experienced this (including my mom, uncle, grandma, and acquaintances) say that in the moment of shock they really did believe they were hearing the voice of their family member.

The AI method makes a more sophisticated class of these attacks easier to perform, but it is still a sophisticated attack: it requires gathering data, and the execution will still require some form of performance. Or… at least that’s what I think
