Cryptocurrency & Free Speech Finance

Google Warns of AI-Powered North Korean Malware Campaign Targeting Crypto, DeFi

By News Room | 6 hours ago | 5 min read

In brief

  • North Korean actors are targeting the crypto industry with phishing attacks using AI deepfakes and fake Zoom meetings, Google warned.
  • More than $2 billion in crypto was stolen by DPRK hackers in 2025.
  • Experts warn that trusted digital identities are becoming the weakest link.

Google’s security team at Mandiant has warned that North Korean hackers are incorporating artificial intelligence–generated deepfakes into fake video meetings as part of increasingly sophisticated attacks against crypto companies, according to a report released Monday.

Mandiant said it recently investigated an intrusion at a fintech company that it attributes to UNC1069, or “CryptoCore”, a threat actor linked with high confidence to North Korea. The attack used a compromised Telegram account, a spoofed Zoom meeting, and a so-called ClickFix technique to trick the victim into running malicious commands. Investigators also found evidence that AI-generated video was used to deceive the target during the fake meeting.

"North Korean actor UNC1069 is targeting the crypto sector with AI-enabled social engineering, deepfakes, and 7 new malware families. Get the details on their TTPs and tooling, as well as IOCs to detect and hunt for the activity detailed in our post: https://t.co/t2qIB35stt"

— Mandiant (part of Google Cloud) (@Mandiant) February 9, 2026

“Mandiant has observed UNC1069 employing these techniques to target both corporate entities and individuals within the cryptocurrency industry, including software firms and their developers, as well as venture capital firms and their employees or executives,” the report said.

North Korea’s crypto theft campaign

The warning comes as North Korea’s cryptocurrency thefts continue to grow in scale. In mid-December, blockchain analytics firm Chainalysis said North Korean hackers stole $2.02 billion in cryptocurrency in 2025, a 51% increase from the year before. The total amount stolen by DPRK-linked actors now stands at roughly $6.75 billion, even as the number of attacks has declined.

The findings highlight a broader shift in how state-linked cybercriminals are operating. Rather than relying on mass phishing campaigns, CryptoCore and similar groups are focusing on highly tailored attacks that exploit trust in routine digital interactions, such as calendar invites and video calls. In this way, North Korea is achieving larger thefts through fewer, more targeted incidents.

According to Mandiant, the attack began when the victim was contacted on Telegram by what appeared to be a known cryptocurrency executive whose account had already been compromised. After building rapport, the attacker sent a Calendly link for a 30-minute meeting that directed the victim to a fake Zoom call hosted on the group’s own infrastructure. During the call, the victim reported seeing what appeared to be a deepfake video of a well-known crypto CEO.

Once the meeting began, the attackers claimed there were audio problems and instructed the victim to run “troubleshooting” commands, a ClickFix technique that ultimately triggered the malware infection. Forensic analysis later identified seven distinct malware families on the victim’s system, deployed in an apparent attempt to harvest credentials, browser data and session tokens for financial theft and future impersonation.

Deepfake impersonation

Fraser Edwards, co-founder and CEO of decentralized identity firm cheqd, said the attack reflects a pattern he is seeing repeatedly against people whose jobs depend on remote meetings and rapid coordination. “The effectiveness of this approach comes from how little has to look unusual,” Edwards said.

“The sender is familiar. The meeting format is routine. There is no malware attachment or obvious exploit. Trust is leveraged before any technical defence has a chance to intervene.”

Edwards said deepfake video is typically introduced at escalation points, such as live calls. “Seeing what appears to be a real person on camera is often enough to override doubt created by an unexpected request or technical issue. The goal is not prolonged interaction, but just enough realism to move the victim to the next step,” he said.

He added that AI is now being used to support impersonation outside of live calls. “It is used to draft messages, correct tone of voice, and mirror the way someone normally communicates with colleagues or friends. That makes routine messages harder to question and reduces the chance that a recipient pauses long enough to verify the interaction,” he explained.

Edwards warned the risk will increase as AI agents are introduced into everyday communication and decision-making. “Agents can send messages, schedule calls, and act on behalf of users at machine speed. If those systems are abused or compromised, deepfake audio or video can be deployed automatically, turning impersonation from a manual effort into a scalable process,” he said.

It’s “unrealistic” to expect most users to know how to spot a deepfake, Edwards said, adding: “The answer is not asking users to pay closer attention, but building systems that protect them by default. That means improving how authenticity is signalled and verified, so users can quickly understand whether content is real, synthetic, or unverified without relying on instinct, familiarity, or manual investigation.”
