An Explainer: The Code of Practice on Disinformation, the Digital Services Act and Twitter

Twitter made headlines recently when it decided to exit from the EU’s Code of Practice on Disinformation. In response, European Commissioner Thierry Breton took to Twitter to caution the platform, tweeting, “You can run but you can’t hide,” in reference to the Digital Services Act (DSA).

Given these developments, many may have questions about the DSA and how it will interact with the Code of Practice on Disinformation. In this explainer, I will outline the purpose of the Code, its current obligations, and how it will interact with the DSA, followed by some thoughts about what the consequences may be for Twitter. 

The Code of Practice on Disinformation

The Code of Practice on Disinformation, also known as “The Code,” is a voluntary framework designed to combat disinformation. Notably, leading social media and search companies such as Google, Meta, Microsoft, TikTok, and Twitter have signed up as Code signatories, alongside smaller-scale companies and NGOs.

Initially introduced in 2018, the Code underwent further development based on feedback such as that produced by EDMO Ireland researchers at DCU’s FuJo Institute, resulting in a Strengthened Code of Practice launched in 2022. 

The Strengthened Code of Practice on Disinformation is made up of 44 Commitments, which are actions that signatories agree to take to combat disinformation. These commitments include:

  1. Scrutiny of Ad Placements: Ensuring effective advertising policies to reduce the spread of disinformation.
  2. Political Advertising: Implementing transparency measures such as clear labelling and disclosure of payment information.
  3. Integrity of Services: Collaborating to combat manipulative behaviour and coordinated campaigns driven by bots or deep fakes.
  4. Empowering Users: Assisting users in recognising and reporting disinformation, promoting media literacy, and transparent recommender systems.
  5. Empowering Researchers: Improving data access and cooperation opportunities for researchers.
  6. Empowering the Fact-Checking Community: Supporting fact-checkers with data, resources, and integration into platform services.
  7. Transparency Centre: Joint efforts to develop and maintain the website.
  8. Permanent Taskforce: Active participation in ongoing improvements of the Code.
  9. Monitoring: Contributing to initiatives enabling monitoring and providing relevant information during crises such as the war in Ukraine. 

Ultimately, the Code is about transparency: it asks signatories to detail the policies, tools, and procedures they employ to combat disinformation, to share data on policy enforcement, and to demonstrate engagement with researchers, fact-checkers, and other platforms.

The Digital Services Act

The Digital Services Act (DSA) was officially adopted on October 19, 2022, with a phased implementation process scheduled to run until early 2024. It applies to all online service providers operating within the EU, but creates a set of additional, specific obligations for Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), defined as services with over 45 million monthly users in the EU. VLOPs and VLOSEs are required to assess and mitigate the systemic risks associated with their services, and tackling disinformation is considered a necessary part of that work. 

While the exact methods for addressing systemic risks are yet to be determined, Article 45 of the DSA encourages the development of voluntary codes of conduct to aid in its implementation. These codes would allow service providers to collaborate on “setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes”. The Code of Practice on Disinformation is specifically mentioned as an example of such a code.

In essence, if a VLOP like Twitter chooses not to participate in the Code, that choice could be treated as evidence of non-compliance with its obligations to assess and mitigate the risk of disinformation under the DSA. Non-compliance penalties can be severe, with potential fines of up to 6% of global turnover.

What does this mean for Twitter?

Twitter is playing a risky game when it comes to the DSA and disinformation. The transparency requirements of the Code ask platforms to make clear how they define and tackle disinformation. While Twitter may have withdrawn from the Code, it is still obligated to report on these efforts separately under the DSA. 

While much speculation around Twitter’s approach to disinformation has focused on the political and personal beliefs of its CEO Elon Musk, by voluntarily participating in the Code, Twitter could have conveyed the message that it takes the issue of disinformation seriously, even if its approach differed from others. This would have demonstrated a willingness to engage in the collaborative efforts of the industry and shown a commitment to transparency and responsible practices. Instead, Twitter’s decision to withdraw from the Code places it under increased scrutiny from regulatory authorities. 

Ultimately, the impact of the DSA will depend upon the willingness of the European Commission to enforce it. Compliance parameters for other areas of systemic risk are less well-defined, but the fact that established standards specifically address disinformation, and that Twitter has chosen not to participate, makes this an easy win if the European Commission wants to prove that it is not afraid to take action against platforms for non-compliance. The responses from European Commission officials such as Thierry Breton and Věra Jourová suggest that they may indeed be willing to enforce consequences when it comes to disinformation and the DSA.