Who's behind the Catherine Connolly deepfake video, and why?
A screenshot of the AI-generated video that falsely portrays Catherine Connolly withdrawing from the presidential race. File picture
This week Ireland witnessed its first sophisticated electoral deepfake, when an AI-generated video mimicking RTÉ news told users of Facebook and YouTube that the election had been cancelled and Heather Humphreys declared the winner, because Catherine Connolly had exited the campaign. None of this is true.
To understand what happened, and how to respond, we need to also consider why people may have gone to great efforts to get this video in front of us.
There are at least four reasons why someone might carry out an act of electoral disinformation. The first is to influence the outcome of an election. Lies about how or when to vote, who is eligible, or as we saw here this week, that an election has been cancelled, can be used to stop certain voters from turning up on polling day.
We saw this tactic deployed using AI content in the US last year. During the January 2024 Democratic primary in New Hampshire, voters received robocalls impersonating Joe Biden, falsely telling them to skip the contest to preserve their vote for November. The calls were highly targeted at likely Biden supporters, aiming to keep them home and tilt the result in his opponent’s favour.
Unless we find out that the Irish deepfake video was targeted at only one or a few groups or demographics, it is hard to see how this particular video would have benefited one candidate over another.

A second motivation for spreading electoral disinformation might therefore be more likely in our case: undermining democracy by sowing confusion or distrust in the process, and undermining the result.
The primary goal of Russian online interference in the 2016 US presidential election was found to be to “sow discord” (according to the Mueller Report) and “undermine public faith in the US democratic process” (according to a Senate investigation).
It was only in the final stages of the campaign that the Russian operations switched to explicitly supporting Trump, once they realised that a Trump presidency would align with these goals. Before that, their work was mostly focused on dividing people and weakening trust in the result.
A similar pattern may have been in operation in last year’s Presidential election in Romania, which had to be cancelled and rerun when authorities uncovered large scale paid operations to interfere with the process online. The interference — and the annulment itself — both succeeded in weakening trust in that country’s democratic processes.

We’re not seeing anything at that scale here, so we should entertain a third, more cynical possibility. There is a market out there for realistic and convincing fake content, and no shortage of entrepreneurs ready to meet that demand.
Standing out as a provider of election interference services is a lot easier if you have some examples to point to. There is a scenario where this week’s deepfake attack in Ireland was a marketing ploy for a deepfake tech vendor.
An entire industry exists of dark arts practitioners selling their wares both to legitimate campaigns and to malign actors. Cambridge Analytica and its partner AggregateIQ were such firms, infamous for their online ads service deployed by the Brexit campaign and in US elections. For these firms, every story about how dangerous and powerful their tools are is a marketing opportunity.
Cambridge Analytica live-tweeted the Congressional grilling of Mark Zuckerberg about its technology, recognising that the narrative could find them new customers. Likewise, smaller illicit setups, such as the Macedonian bot farms hired by Russia during the 2016 presidential election, need case studies to show prospective clients that their product is good.
The AI generated video we saw this week could well be a display item for some deepfake vendor’s shop window. That would make the many articles — including this one — talking about how sophisticated it was into marketing materials for the tech underworld.
There is also, of course, the strong possibility that this was intended as a laugh or, as the kids would say, a flex. We might have been witnessing an attempt to show off or score points in some underground game that we cannot see.
Parts of the web celebrate nihilistic trolling, and this likely motivated at least some of the sharing of this video. This brings to mind those people who this time last year knowingly promoted a Dublin 'Halloween Parade' that an AI slop website had hallucinated. Many of those posting about it knew it wasn’t real, but promoted it anyway because they thought it would be funny to see if anyone turned up.
We may never know the true identity of the people behind this week’s deepfake, but we do know that they went to great efforts to make this piece of content convincing. Working out why would be a first step towards understanding the stakes, and what we need to do about it for next time. Because there will be a next time.