Deepfake of Canadian PM Carney Goes Viral Day Before Federal Election
Summary
On April 27, 2025, the day before Canada's federal election, a deepfake video depicting Prime Minister Mark Carney announcing fabricated vehicle regulations went viral on TikTok, reaching millions of views before the platform removed it. The audio was generated with Fish Audio voice-cloning software. Canada's Communications Security Establishment (CSE) had warned in advance that China, Russia, and Iran would likely deploy AI-generated content in the campaign. The Liberals won the April 28 election.
What Happened
A synthetic video depicting Mark Carney making false policy announcements circulated widely on TikTok on the eve of the April 28 federal election. DFRLab analysis confirmed the audio was produced with Fish Audio, a commercially available voice-cloning platform. The video was styled to look like an official government announcement, mimicking the visual conventions of federal communications.
The video accrued millions of views on TikTok before the platform took it down, and it also spread on X. Because removal came so close to the vote, the election-eve timing made correction efforts largely ineffective: voters who had seen the video had already formed impressions that debunking could not reliably reverse.
Canada's CSE had published a pre-election threat assessment warning specifically that foreign state actors, including China, Russia, and Iran, would likely use AI-generated content to interfere with the 2025 campaign. The warning proved prescient. Elections Canada, the independent electoral agency, lacked specific statutory authority to compel platform takedowns of AI-generated content during the election period; its existing rules addressed paid political advertising but not organic viral synthetic media.
The Liberals under Carney won the April 28 election. There is no evidence that the deepfake materially altered the electoral outcome, but with millions of views on a platform disproportionately popular among younger voters, it became the most widely seen piece of electoral disinformation in Canadian history to that point.
Why It Matters
The Carney deepfake crystallized a regulatory gap that advanced democracies have not closed: platforms operating under general community standards cannot respond at the speed that AI-generated election disinformation spreads, particularly on election eve. The 24-hour period before a vote is the highest-value window for synthetic disinformation precisely because correction cycles are too slow and voters are mentally finalizing decisions.
Canada's experience also highlighted the mismatch between threat intelligence and legal authority. The CSE accurately predicted the threat; Elections Canada had no statutory tool to act on that prediction. This gap, between what intelligence agencies know and what electoral authorities can do, is a structural feature of most democratic legal frameworks, which were built for human-speed campaigning, not algorithmic viral propagation.
The case became a reference point in post-election debates about whether Canada needed a specific AI-in-elections regulatory regime, distinct from its existing digital advertising rules.