As AI technology evolves, telecom fraudsters are beginning to use AI face-swapping, AI voice mimicry, and other tactics to defraud victims of their money. This new type of scam has become a problem that many countries around the globe now have to confront.
On the 18th, local time, Metro reported that a survey of more than 3,000 people by the UK's Starling Bank revealed that "voice cloning" scams, which use AI to recreate the voices of a victim's friends and family from as little as three seconds of audio, have become a widespread problem in the country.
The latest data show that more than a quarter of UK adults (28%) say they have been targeted by a high-tech voice cloning scam in the past year. Even more worrying, nearly half (46%) are not even aware that "voice cloning" fraud exists, making them more likely to fall victim if targeted.
Voice clips longer than three seconds are quite common on social platforms, so fraudsters can use various techniques to identify a target's family members, then use the cloned voice to place phone calls or leave voice messages and voicemails, exploiting the victim's trust to swindle money.
Nearly one in ten respondents (8%) said that in such a situation they would send whatever information was requested, even if the call seemed suspicious.
Starling Bank is urging people not to simply trust their ears, but to agree on a code word or phrase with their loved ones so they have a way of verifying each other's identity. The report noted that financial fraud in England and Wales is on the rise, jumping by 46% last year.
Beyond the UK, seniors in the U.S. lost $1.6 billion (about RMB 11.36 billion) to scams in 2022, many of which used AI to "clone" the voices of acquaintances, among other AI-generated tricks.