TikTok’s Struggle with Political Advertising

Dr Shane Murphy from DCU FuJo examines key issues with TikTok and political advertising in light of the DSA and EU Code of Practice on Disinformation.

The recent European parliamentary elections were the first to take place since the EU’s Digital Services Act (DSA) came into effect in February 2024. The DSA requires large social media companies and search engines (any platform with 45+ million users in the EU) to take action to mitigate systemic risks, including those related to disinformation around elections. Since its passing, it has become industry standard for social media companies to ensure political ads are labelled as such, and to maintain publicly accessible repositories of all political ads run on the platform, including basic information such as how much each ad cost, who paid for it, and how many views it received. In the run-up to the EU elections, and indeed Ireland’s local elections, social media platforms faced scrutiny for their handling of election-related content. TikTok in particular was found to have significant flaws in its policies and practices for detecting political ads and curbing disinformation, which appear to put it at odds not just with its own internal rules, but also with EU regulations.
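To make the DSA’s transparency requirements concrete, the following is a minimal sketch of the kind of record a public political-ad repository could expose for each ad. The dataclass and field names are illustrative assumptions for this article, not any platform’s actual schema.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical record mirroring the basic information the DSA expects a
# political-ad repository to disclose; names are assumptions, not a real schema.
@dataclass
class PoliticalAdRecord:
    ad_id: str                # unique identifier for the ad
    sponsor: str              # who paid for the ad
    cost_eur: float           # how much the ad cost
    impressions: int          # how many views it received
    labelled_political: bool  # whether the platform labelled the ad as political
    removed: bool = False     # whether the ad was later taken down
    removal_reason: Optional[str] = None  # why it was removed, if disclosed
```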

On June 1st 2024, transparency activist and Digital Action founder Liz Carolan published a piece revealing significant shortcomings in TikTok’s policy enforcement and ad review process. Despite the platform’s claim to have “long prohibited political advertising, including both paid ads and creators being paid to make branded political content”, multiple candidates were found to be running ads just one week out from election day. This represents a breach not only of the DSA guidelines relating to the correct labelling of political advertisements, but also of those relating to transparency and the provision of a publicly available library of political ads. By “banning” political advertisements, TikTok exempts itself from having to provide such a database and disclose this information. However, by failing to enforce this ban, TikTok has created a way for this content to circulate without safeguards or accountability, preventing democratic oversight.

Carolan’s investigation revealed that TikTok had failed to implement even the most basic protocols for identifying political ads. By searching generic keywords like “Election” and “Vote” in the platform’s ad library, she was able to find a large selection of political advertisements from Fine Gael, Sinn Féin, and Social Democrats candidates, as well as numerous advertisements uploaded by Independents. The investigation also highlighted that political advertisements likely exist which do not use these keywords, including content designed not to promote any individual candidate or party, but to persuade viewers on specific issues, such as immigration. Identifying such advertisements on TikTok’s platform is extremely challenging because its repository includes hundreds of thousands of ads, with no clear distinction between political ads and other paid content. It was also noted that TikTok does not release information about political ads which have been removed from the platform, making it impossible to know how many times they were viewed before they were ultimately removed, or even why certain ads were removed. As a result, it is impossible to differentiate between ads removed for minor infractions and ads removed for more serious reasons, such as inciting violence, significantly hampering the ability of researchers and journalists to scrutinize election-related messaging and spending on the platform.
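The keyword search Carolan describes can be sketched as a simple filter over an exported ad library. The function below is a hypothetical illustration of that approach, assuming ad-library entries exported as dictionaries with a free-text `ad_text` field; it is not TikTok’s actual ad-library API, and, as the investigation notes, issue-based ads that avoid these generic keywords would slip through it entirely.

```python
# Illustrative keyword filter over an exported ad library; a sketch of the
# search approach described in the investigation, not TikTok's actual API.
GENERIC_KEYWORDS = {"election", "vote"}

def find_candidate_political_ads(ads: list[dict]) -> list[dict]:
    """Return ads whose text mentions any generic election keyword."""
    hits = []
    for ad in ads:
        text = ad.get("ad_text", "").lower()
        if any(keyword in text for keyword in GENERIC_KEYWORDS):
            hits.append(ad)
    return hits

# Example: an issue-based ad that never uses the words "election" or "vote"
# is invisible to this kind of search.
example_ads = [
    {"ad_text": "Get out and vote on June 7th!"},
    {"ad_text": "Our borders need stronger controls."},  # issue ad, no keyword
]
print(find_candidate_political_ads(example_ads))  # only the first ad matches
```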

Responding to Carolan’s investigation, a spokesperson for TikTok explained that “TikTok does not allow paid political advertising, and this content has been removed from our platform”. The company claimed to have “protected our platform through more than 150 elections globally”, explaining that it employs a combination of human moderators and technology to enforce its policies, and that it regularly reviews and updates those policies to combat disinformation. However, at the time this statement was released, Carolan reported that she was still able to find political advertisements with relative ease.

On June 4th, similar findings were revealed by Global Witness, who submitted 16 ads containing disinformation to TikTok, YouTube and X. X rejected all 16 ads and suspended the associated accounts, while YouTube rejected all but two. TikTok, however, accepted all 16. The ads included false claims that polling stations had been closed due to outbreaks of infectious diseases, suggestions that it was possible to vote via text, false information about legal voting ages, and content inciting hatred against migrant voters. Importantly, all ads were withdrawn by Global Witness before going live; no members of the public were ever unknowingly exposed to this disinformation. However, to test TikTok’s claim that it does not allow advertisements that even reference elections, the researchers successfully shared an ad that simply read “It’s an Election Year”, which received 12,000 impressions in its first hour. Following their investigation, Global Witness submitted a complaint to the EU regulator, sharing their evidence of TikTok’s inability to enforce its own policies. In response, TikTok attributed the lapse in policy enforcement to human error and said the responsible moderator was being retrained. Nevertheless, the combined findings of Carolan’s and Global Witness’s investigations point to much deeper issues in TikTok’s moderation infrastructure. At the same time, YouTube’s and X’s relative success shows that such basic enforcement is indeed possible.

The DSA also mandates that these companies and platforms have the capability to react quickly to any attempts to manipulate their services and circumvent their policies in efforts to undermine electoral processes. Such basic failures on a platform with 134 million monthly users in Europe could expose the public to disinformation that would significantly impact their ability to make informed choices, at a time of significant global challenges around highly contested issues such as conflict, climate change and immigration. This is a particular concern for younger voters, who receive a proportionately higher volume of their news from online sources, including TikTok. Carolan emphasizes this in her investigation, explaining: “democracy depends on campaigning happening out in the open, especially when it comes to people spending money to influence elections”. Ensuring the integrity of elections and maintaining public trust requires tracking political advertisements and their funding sources, and preventing disinformation before it has a chance to circulate.

Following a formal complaint lodged by Carolan with the Irish media regulator Coimisiún na Meán, TikTok could now face a fine of up to 6% of the company’s annual turnover, which, if upheld, would amount to hundreds of millions of euros. It is now up to Coimisiún na Meán to assess the allegations and determine whether to move forward with enforcement action. This process involves an initial review, which may be escalated to an investigation team. Carolan was able to lodge this complaint thanks to specific provisions in the DSA which grant individual citizens new rights, empowering them to hold tech companies accountable. Speaking on these new rights, Carolan explained: “This is a brand new mechanism, and I look forward to seeing how this feeds into accountability for big tech platforms in their responsibilities to our democracy in Ireland”. The outcome of this regulatory action could set new precedents for how digital platforms operate in politically sensitive contexts.

In the meantime, FuJo and EDMO Ireland will be working with colleagues across the EDMO network to investigate how platforms are complying with their political advertising commitments under the EU Code of Practice on Disinformation. There is an expectation that this currently voluntary code will become a binding “code of conduct” under the DSA.