A convincingly executed deepfake of Nvidia CEO Jensen Huang stole the spotlight during his GTC DC keynote, drawing far more attention than the real broadcast and steering viewers toward a crypto scam. The impersonation ran concurrently as a “live” stream on a copycat YouTube channel, which allegedly reached around 95,000 viewers at its peak, several times the audience of the legitimate broadcast, Reuters reported.
How a fake Jensen outpaced the real keynote
As journalists flagged at the time, the sham stream appeared on a channel labeled “NVIDIA Live” and swiftly rose to the top of YouTube’s search results for the keynote. Technology reporters said the phony stream attracted nearly seven times as many viewers as the genuine one; the authentic Nvidia broadcast was watched live by about 12,000 people, while the fraudulent stream climbed the rankings, likely with help from bots.

CRN’s Dylan Martin took note of the stream before it disappeared and captured a transcript in which the fake Huang lauded Nvidia hardware for non-fungible token mining and promised consumers access to a delivery platform. A QR code on the presentation screen funneled viewers into the scheme: an age-old ruse dressed up in next-generation generative video.
What viewers saw and why the fake livestream worked
The deepfake did not need to be flawless. Despite clumsy diction and periodic lip-sync drift, it was good enough to fool casual audiences or viewers unfamiliar with Huang’s speaking style. The swindlers employed familiar social engineering bait: a time limit, buzzy event marketing, and aggressive discoverability tactics that can push a stream into top search results or recommendations.
Crypto “giveaways” and mining tie-ins continue to work because they promise rapid rewards and exploit technical FOMO. Since 2021, consumers have reportedly lost billions of dollars to crypto-related fraud, and for newcomers, social platforms are often the first point of contact with these schemes. In this case, a single scan of the QR code could redirect targets to a website that empties their wallet or a phishing site that collects private keys and personal information.
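As a rough illustration of that last point, here is a minimal Python sketch of the kind of allowlist check a cautious viewer or a moderation tool could run on a URL decoded from a QR code before opening it. The domain list and heuristics are assumptions made for the example, not an official safeguard.

```python
# Minimal sketch: vet a URL decoded from a QR code before visiting it.
# The allowlist below is an assumption for illustration, not an official list.
from urllib.parse import urlparse

OFFICIAL_DOMAINS = {"nvidia.com", "youtube.com", "youtu.be"}  # hypothetical allowlist

def looks_official(url: str) -> bool:
    """Return True only for HTTPS URLs whose host is an allowlisted domain or subdomain."""
    parsed = urlparse(url)
    if parsed.scheme != "https":          # plain HTTP or odd schemes are a red flag
        return False
    host = (parsed.hostname or "").lower()
    if host.startswith("xn--"):           # punycode can hide look-alike domains
        return False
    return any(host == d or host.endswith("." + d) for d in OFFICIAL_DOMAINS)

print(looks_official("https://www.nvidia.com/gtc/"))          # True
print(looks_official("https://nvidia-giveaway.example/qr"))   # False
```

Even a URL that passes a check like this deserves suspicion if it leads to wallet prompts or payment requests.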

Why major platforms keep missing these scams
Detection is a moving target. YouTube has implemented policies and controls related to synthetic media, such as labels for altered content and a complaint process, and it increasingly relies on AI to detect voice or face impersonation. Google’s broader efforts, such as watermarking research and provenance metadata, can also be helpful, but these tools and controls are not uniformly enforced or sufficiently effective.
Scammers, on the other hand, move quickly. They reuse networks of compromised channels, purchase views, mimic official thumbnails and titles, and “premiere” content around actual events to siphon traffic during brief attention windows. Even a few minutes of delay in enforcement can translate into thousands of views and likes, and only a small fraction of those viewers need to convert for the scam to pay off.
Spotting the tells and staying safe from scams
- Verify the channel: Nvidia’s real account is verified and has a long history; check the handle, subscriber count, and prior uploads, and beware of sparse archives or recent name changes (a programmatic version of this check is sketched after this list).
- Cross-check the event: Visit Nvidia’s newsroom or corporate X account to find the official stream link. Fraudsters depend on you staying inside the platform’s search results.
- Treat QR codes as hostile: Legitimate presentations rarely push viewers to scan unmanaged QR codes for wallets, “airdrops,” or giveaways. If money is involved, assume it’s bait.
- Watch for performance quirks: Unnatural pauses, mismatched lip movements, or generic, looping talking points often surface in real time, especially in longer segments.
- Check the call to action: Urgency, promises of guaranteed returns, and requests to move conversations off-platform are classic fraud markers, regardless of production quality.
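For readers who want to automate the first check above, here is a minimal sketch that pulls a channel’s creation date, subscriber count, and upload count through the YouTube Data API v3 channels.list endpoint. The handle and API key are placeholders, and a passing result only shows that a channel matches the official handle’s public profile; it cannot by itself prove a given stream is legitimate.

```python
# Minimal sketch: fetch basic trust signals for a YouTube channel handle.
# Requires a YouTube Data API v3 key; the handle and key below are placeholders.
import requests

API_KEY = "YOUR_API_KEY"   # placeholder; create a key in Google Cloud Console
HANDLE = "@NVIDIA"         # assumed official handle, used here for illustration

resp = requests.get(
    "https://www.googleapis.com/youtube/v3/channels",
    params={"part": "snippet,statistics", "forHandle": HANDLE, "key": API_KEY},
    timeout=10,
)
resp.raise_for_status()
items = resp.json().get("items", [])

if not items:
    print(f"No channel found for {HANDLE}; treat any look-alike as suspect.")
else:
    channel = items[0]
    print("Title:      ", channel["snippet"]["title"])
    print("Created:    ", channel["snippet"]["publishedAt"])  # long history is a good sign
    print("Subscribers:", channel["statistics"].get("subscriberCount", "hidden"))
    print("Uploads:    ", channel["statistics"].get("videoCount"))
```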
The bigger picture for AI, media trust, and safety
This episode underscores how cheaply produced synthetic media can hijack marquee moments, particularly in tech, where audiences arrive primed for big announcements. Researchers and security firms have tracked a steady rise in deepfake-enabled scams across finance and celebrity impersonation, while industry groups warn that synthetic media will continue to erode default trust online unless platforms, rights holders, and advertisers coordinate more aggressively on provenance and rapid takedown.
Until then, the best defense is skepticism plus verification. If a “CEO” is urging you to scan a code and fund a wallet during a keynote, it’s almost certainly not innovation—it’s impersonation.
