Vote early, vote often? Elections in Portugal, the Philippines, Romania, Canada, and Australia have revealed a common thread of tactics to influence elections around the globe.
2024 was a signal year for democratic elections, with almost half the planet’s population voting in elections across 64 countries.
However, in April and May of 2025, five nations went to the polls, and while one common outcome was a rejection of right-wing politics – likely a reaction to the Trump administration and its perceived chaos – another common thread was the scale and speed of election interference.
Researchers at disinformation security firm Cyabra used artificial intelligence to monitor and track coordinated campaigns across all five elections. They found alarming similarities, noting that campaigns in the five countries all seemed to come from the same playbook: “bots, fake support or outrage, viral manipulation”.
In Romania, Cyabra found that a full 16 per cent of all X accounts discussing the country’s election crisis were fake, with a potential reach of up to 20 million views. In Canada, 28 per cent of negative posts about eventual election winner Mark Carney came from fake accounts, and in Portugal, 58 per cent of posts praising the far-right Chega party as a defender of the country were likewise fake. It was the same story in the Philippines, where 32 per cent of posts supporting Rodrigo Duterte were found to be fake.
And in Australia, fully 17 per cent of accounts discussing the election were likely fake. In one instance, a bot made 521 posts in such a small amount of time that no human could have been responsible.
Cyabra’s analysis was undertaken both in the lead-up to and after these elections, and in the case of Australia, the findings were released within days of everyone heading to their local voting booth and picking up their Democracy Sausage. Let’s see what happened here.
The Australian situation
Perhaps unsurprisingly, fake or bot accounts were used to target both of the major parties during the run-up to the election. Seventeen per cent of accounts taking part in conversations regarding the Australian Labor Party (ALP) and its leadership were assessed to be fake, while 18 per cent of those discussing what used to be the Coalition were fake.
On many occasions, the content generated by these fake actors outperformed that created by real users on X. Both parties were often painted as either incompetent or corrupt, while Labor was also frequently characterised as being too “woke”.
“The presence of fake accounts was not only widespread but also strategically deployed, with the intent to manipulate narratives, influence public perception and sway voter sentiment in the lead-up to the election,” Cyabra said in its report on the Australian election campaign.
“Fake profiles accounted for approximately 17 per cent of the analysed activity, engaging in tactics designed to erode trust, amplify divisive messaging and create the illusion of consensus. Inauthentic accounts targeting the ALP relied heavily on mockery and economic fearmongering, while those targeting the LNP employed a more coordinated, pro-Labor narrative aimed at shifting voter alignment.”
One account in particular stood out: southoz, which joined X in 2023. This account generated 521 posts, many relying on AI-generated images and highly negative language, including posts mocking Anthony Albanese.
“This reflects a deliberate and strategic messaging approach, relying on ridicule, repetition and emotionally charged language – hallmarks of coordinated or bot-like behaviour designed to maximise visibility and provoke engagement,” Cyabra said.
According to Cyabra, this account alone was likely to have reached 726,000 users.
The bigger picture
Other reporting from Cyabra notes that while fake profiles are always active on social media – typically making up about 5 to 12 per cent of online conversations – during elections, this activity can peak as high as 50 per cent. Some accounts activate only for the election cycle, while others maintain activity over time to create the illusion of authenticity.
Election interference is a slow and long-term process, and while some state-based actors may favour one party over another, the ultimate goal is to create mistrust in institutions and confusion among the electorate.
“Across all five national elections, we saw coordinated disinformation tactics, just adapted to different political environments. Bot networks were strategically deployed, highly synchronised, and laser-focused,” Dan Brahmy, CEO of Cyabra, told Defence Connect.
“They used every trick in the book: fake accounts generating momentum, hijacking narratives and distorting public perception in real time. These fake accounts weren’t joining conversations – they were inventing them and then watching the ripple effects spill into the real world.
“Highly organised networks pushed strategic messaging around each campaign’s key moments. From flooding identical hashtags in the Philippines to faking outrage consensus in Romania to inflating far-right sentiment in Portugal, the goal was always the same: manufacture influence fast.
“Campaigns like these warp perception at scale. They distort public opinion and what people think public opinion is. It’s like taking the temperature of mannequins and thinking you’ve got a fever.”