Platforms Failing on Electoral Integrity: EDMO Ireland’s Initial Findings

Posted on 25/10/2025 By EDMO Admin
Elections, News

As part of EDMO Ireland’s ongoing review of platform compliance with the EU Digital Services Act (DSA) during Ireland’s 2025 Presidential Election, we are sharing our initial observations of several concerning trends in how online platforms have addressed electoral integrity.

The DSA is the EU’s framework for regulating online platforms. It requires very large platforms to identify and mitigate systemic risks, including risks to electoral integrity. As such, platforms are expected to implement effective measures, enforce their own policies, and demonstrate transparency in how they manage harmful content and behaviour during elections.

Across platforms, we continue to see a largely reactive rather than proactive approach to safeguarding electoral integrity. This reactive model relies heavily on the voluntary efforts of researchers, fact-checkers, and civil society organisations, rather than on sustained, systematic action by the platforms themselves.

Harassment and Intimidation of Public Figures

Harassment of politicians and public figures has been a persistent feature of online activity during the election period. Across multiple platforms, candidates and public representatives have been subjected to personal abuse, targeted harassment, and defamatory claims, often amplified through accounts with histories of extreme or conspiratorial content. Responses from platforms have been inconsistent and largely reactive: some acted only after reports were made, while others took no visible action at all. This pattern indicates that enforcement mechanisms remain weak, with little evidence of proactive monitoring or intervention to protect individuals engaged in democratic participation.

Facilitation of Illegal Vote-Spoiling Content

While spoiling a vote is a legitimate form of political expression, photographing ballot papers inside polling stations and distributing those images is a violation of Irish electoral law. On polling day, several individuals posted images of their ballots online, displaying spoiled votes accompanied by messages describing candidates as “traitors” and other derogatory remarks. Notably, some of the accounts promoting these images have a history of content violations and misinformation, raising further concerns about the adequacy of platform monitoring and enforcement.

Lack of Labelling on AI-Generated Content

Despite heightened concerns about the use of AI-generated content in elections, we note that such material is often not clearly labelled, contrary to recommendations from both the European Commission and the Irish Electoral Commission to ensure transparency. Synthetic images, videos, and voice clips that imitate real people or events can blur the line between authentic and fabricated information, increasing the risk of misleading voters or distorting public debate. Under the DSA, platforms are expected to take reasonable measures to mitigate these risks, including the clear labelling of synthetic media and the promotion of verified information sources. The absence of consistent labelling suggests that platform safeguards against AI-driven manipulation remain inadequate.

Insufficient ‘User Education’ Tools

Online platforms often commit to promoting reputable information sources, such as official electoral authorities, to help users access accurate information and counter the effects of misinformation. However, user information tools (e.g., prompts, banners, and in-feed notices) are frequently overly simplistic and do not appear to consistently or accurately accompany election-related content. This inconsistency raises questions about the ability of AI-driven systems to detect relevant material and about the overall effectiveness of the user education approach in mitigating misinformation risks during elections.

Amplification of Disinformation via Platform Features

Platform features may inadvertently facilitate or amplify electoral disinformation, highlighting the systemic risks created by platform design rather than individual pieces of content. Under the DSA, platforms are required to assess and mitigate such systemic risks, yet individuals known for spreading misleading claims about candidates and the electoral process were observed to hold verified or monetised accounts, giving their content greater visibility and credibility. In some cases, banned users reappeared under new accounts, further undermining enforcement. These patterns raise serious questions about whether platforms are conducting adequate risk assessments of their own systems and features, as required by the DSA.

Failure to Provide Transparency on Advertising

Transparency in political advertising is essential to safeguard fair and informed democratic participation. Despite EU requirements for clear labelling, disclosure, and public repositories of political ads, major platforms have failed to provide consistent transparency during the election period. In some cases, ad libraries were removed or made inaccessible, while in others, records of paid political ads were incomplete or outdated. These gaps prevent voters, researchers, and regulators from assessing who is funding political messages and how they are being targeted, undermining accountability in the digital campaign environment.

Conclusion

Ultimately, these trends raise serious questions not only for the platforms themselves but also for Irish and European regulators responsible for ensuring the effective implementation of the Digital Services Act and Irish electoral law. The patterns observed suggest that systemic risks to electoral integrity are not being adequately mitigated in practice. EDMO Ireland will publish a detailed report in the coming weeks, providing case studies and further analysis of platform behaviour during the 2025 Presidential Election.
