Artificial Intelligence (AI)-generated voice cloning is on the rise, and cybercriminals across India have been using it to extort money. Delhi, the national capital, alone registered 685 cybercrime cases in 2022, up from 345 in 2021 and 166 in 2020, according to the National Crime Records Bureau (NCRB).

It's a classic and common scam, and like many scams it relies on a frightening, urgent scenario to override the victim's common sense and push them to send money. Now, scammers are reportedly experimenting with a way to heighten that panic further: playing a simulated recording of the victim's own voice.

The ability to create audio deepfakes of a person's voice, using machine learning and just minutes of recorded speech, has become relatively cheap and easy to acquire. Myriad websites let you make voice clones. Some offer a range of celebrity voices that can be made to say anything; others let you upload a recording to create a clone of a new person's voice. Scammers have figured out that they can use these tools to clone the voices of ordinary people. Suddenly your relative isn't hearing someone who sounds like a complete stranger; they are hearing your own voice. That makes the scam far more convincing, and far more concerning.
11 Mar 2024