This wasn’t the first time an AI tool was used to commit fraud. Earlier this year, a Chinese man was defrauded of more than half a million dollars after a cybercriminal used an AI face-swapping tool to impersonate the victim’s friend and persuade him to transfer the money, according to Reuters.

CBS News also reported on the rise of AI audio tools that can clone someone’s voice, which scammers then use to pose as a person in distress. CNN covered a similar incident in which scammers tried to convince a mother that her daughter had been kidnapped, using an AI clone of the daughter’s voice. In another case reported by Insider, a father was told over the phone that his son had been in a serious accident.

Criminals stage these scenarios and send fabricated media to loved ones to extract money, claiming it is needed for emergency assistance. This is a modern take on imposter scams, which are nothing new: in a February 2023 report, the FTC said Americans lost nearly $2.6 billion to this kind of fraud in 2022. The advent of generative AI, however, has raised the stakes dramatically.
