The crypto world faces a new AI threat: a deepfake tool that can bypass two-factor authentication (2FA) on crypto exchanges. This poses fresh cybersecurity challenges for exchanges, which rely on 2FA and identity verification to prevent fraudulent activity and protect users.
The tool, created by a threat actor known as ProKYC, forges documents and generates fake identities and video likenesses, using advanced deepfake technology to defeat facial recognition checks.
As a result, the tool is being widely marketed to cybercriminals, who can use it to easily create verified accounts on crypto exchanges. Hackers could then exploit those accounts for money laundering and other fraudulent activities.
The American Association of Retired Persons (AARP) reports that new account fraud resulted in $5.3 billion in losses last year across various sectors. The cryptocurrency market, known for its high-value transactions and relative anonymity, presents an attractive target for cybercriminals employing such sophisticated techniques.
Cybercriminals are using the forged documents and deepfake videos generated by the tool to open fake accounts on crypto exchanges, bypassing facial recognition during onboarding. This comes at a time when crypto-related crime hit a record $20.1 billion in 2022.
In the face of such challenges, crypto exchanges should rethink their approach to cybersecurity and build multi-layered defenses, including AI-based fraud detection. According to an IEEE study, AI-enabled fraud detection systems can identify deepfake videos with 94% accuracy.
As Etay Maor, chief security strategist at Cato Networks, pointed out, simply tightening authentication processes may not be the best solution for crypto exchanges. Overly restrictive biometric authentication systems can increase false-positive alerts, degrading user experience and operational efficiency.