Deepfake videos with artificial backgrounds: a new weapon of North Korean cybercriminals against the crypto industry
Cyber threats have reached a new level: BlueNoroff, a subgroup of the North Korea-linked Lazarus Group, is actively using advanced artificial-intelligence tools to stage fake video calls with realistic backgrounds. These attacks target crypto-industry professionals and show how dangerous the combination of deepfake technology and social engineering has become.
Deception Tactics: Fake Background Videos as Manipulation Tools
According to the crypto media outlet Odaily, the hackers initiate video calls from compromised Telegram accounts and play AI-generated videos with fake backgrounds, impersonating people the victim trusts. Martin Kuhar, co-founder of BTC Prague, shared alarming details: the attackers persuade targets to install malicious software disguised as a Zoom plugin, ostensibly to fix audio issues. Realistic video with a plausible background significantly increases the victim's trust, making the social engineering far more effective.
Technical Arsenal: What Happens After Installing the Malware
Once the victim installs the disguised malware, the attackers gain full control of the device. Malicious scripts carry out a multi-stage infection on macOS, deploying backdoors and keyloggers and intercepting clipboard contents. Particularly dangerous is the malware's ability to reach encrypted wallets and users' private keys.
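Clipboard interception in these campaigns typically follows the classic "clipper" pattern: the victim copies a wallet address, and the malware silently replaces it with an attacker-controlled one before pasting. As a minimal defensive sketch (not based on the actual malware's code), the check below flags when a copied string and the string about to be pasted are both wallet-address-shaped but differ. The address patterns are illustrative assumptions, not an exhaustive list of formats.

```python
import re

# Loose, illustrative patterns for two common address formats.
# Real wallets use many more formats; this is a sketch, not a validator.
ADDRESS_PATTERNS = [
    re.compile(r"^0x[0-9a-fA-F]{40}$"),            # Ethereum-style
    re.compile(r"^(bc1|[13])[a-zA-Z0-9]{25,62}$"),  # Bitcoin-style
]

def looks_like_address(text: str) -> bool:
    """True if the text matches any of the known address shapes."""
    return any(p.match(text) for p in ADDRESS_PATTERNS)

def clipboard_was_swapped(copied: str, about_to_paste: str) -> bool:
    """Flag the clipper pattern: the user copied one wallet address,
    but the clipboard now holds a *different* address."""
    return (looks_like_address(copied)
            and looks_like_address(about_to_paste)
            and copied != about_to_paste)
```

A wallet UI or security tool could run such a comparison just before a send transaction and warn the user if the destination address no longer matches what they copied.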
Security company Huntress has identified that this methodology is closely linked to previous operations by the same group targeting cryptocurrency developers. Researchers from SlowMist confirmed that the attacks demonstrate a clear pattern of reusing tactics with adaptations for specific crypto wallets and targeted professionals.
Why Deepfake Video Is a New Threat to Identity Verification
The proliferation of fake-video and voice-cloning technology fundamentally changes the digital security landscape. Traditional identity verification over video calls is no longer reliable, since backgrounds, facial expressions, and voices can all be convincingly simulated by AI. This presents a fundamental challenge for the industry.
How to Protect Crypto Assets: Practical Recommendations
Crypto professionals must strengthen their cybersecurity measures immediately. It is essential to implement multi-factor authentication, use hardware security keys instead of SMS codes, and, most importantly, verify any video-call request out of band, over a separate communication channel from the one the request arrived on. Additionally, avoid installing software recommended by strangers, even if the video call looks convincing and the background seems right. Vigilance and awareness remain the best defenses against the growing threat of deepfake attacks.
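One concrete way to implement out-of-band verification is a simple challenge-response over a pre-shared secret: the secret is agreed in person or over a separately verified channel, and during a suspicious call one party sends a random challenge over the second channel and checks the other side's answer. The sketch below is a hypothetical illustration using Python's standard library, not a protocol from the source article.

```python
import hashlib
import hmac
import secrets

def make_challenge() -> str:
    """Generate a fresh random challenge to send over the second channel."""
    return secrets.token_hex(8)

def respond(shared_secret: bytes, challenge: str) -> str:
    """Compute the expected answer: a short HMAC of the challenge."""
    mac = hmac.new(shared_secret, challenge.encode(), hashlib.sha256)
    return mac.hexdigest()[:8]

def verify(shared_secret: bytes, challenge: str, answer: str) -> bool:
    """Constant-time check that the answer matches the expected HMAC."""
    expected = respond(shared_secret, challenge)
    return hmac.compare_digest(expected, answer)
```

Because a deepfake attacker on the call does not hold the shared secret, they cannot produce a valid answer even if they perfectly mimic the other person's face, voice, and background.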