
Integration of the EU Code of Practice on Disinformation into the Digital Services Act


In response to growing concerns over the spread of online disinformation, the European Commission (EC) and the European Board for Digital Services (EBDS) officially endorsed the integration of the Code of Practice on Disinformation into the Digital Services Act (DSA) on 13 February 2025.[1] This integration marks a pivotal development in the EU’s evolving digital governance landscape. Although the Code itself remains voluntary and participation in it is optional, its new status as a recognized Code of Conduct under Article 45 of the DSA means it will now function as a benchmark for compliance, especially for Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs).[2] Once a platform opts in, it is expected to uphold the commitments in the applicable code, which, while not legally binding, become subject to regulatory scrutiny and independent audits under the DSA framework.[3]

Such codes offer several key benefits under the DSA, particularly in operationalising its provisions: they clarify systemic risks and provide concrete guidance on what companies should address, thereby supporting the consistent implementation of the Act’s broad or open-ended obligations.[4]

Background and Evolution of the Code

The Code of Practice on Disinformation originated in 2018, marking the first time that major online platforms voluntarily agreed to self-regulatory standards to combat the spread of disinformation.[5] Original signatories included Facebook, Google, Twitter (now X), and Mozilla, followed by Microsoft in 2019 and TikTok in 2020.[6] After receiving guidance from the Commission, the Code underwent a substantial revision in 2022, resulting in a more robust and comprehensive framework.[7] As of early 2025, 42 signatories support the Code, including tech giants and advertising platforms such as Adobe, TikTok, and Vimeo,[8] while X (formerly Twitter) withdrew in May 2023.[9]

With its integration into the DSA under Article 45, the Disinformation Code becomes the second voluntary code of conduct incorporated into the DSA framework, following the Code of Conduct on Countering Illegal Hate Speech Online.[10] This shift reflects the EU’s growing reliance in recent years on co-regulatory models that mobilize firms by blending voluntary corporate commitments with binding legal frameworks.[11]

Key Commitments and Measures

The Disinformation Code outlines 44 commitments and 128 specific measures, addressing various dimensions of disinformation while aiming to uphold freedom of expression and transparency. These commitments fall into four interlinked areas:[12]

  1. Demonetisation:
    • Preventing the monetisation of harmful content.
    • Ensuring advertisements are neither placed next to disinformation nor disseminate it.
  2. Transparency in political advertising:
    • Clearer labelling of political ads.
    • Disclosure of sponsors, spending, and ad display duration.
  3. Integrity of services:
    • Mitigation of manipulative behaviours, such as bot-driven amplification, fake accounts, impersonation, and malicious deep fakes.
    • Ongoing review of tactics used by malicious actors.
  4. Empowerment of users, researchers, and fact-checkers:
    • Better tools for identifying disinformation.
    • Enhanced access to data for researchers.
    • Expanded fact-checking networks across EU member states.

While voluntary, these commitments are broad in scope and designed to serve as risk mitigation strategies under the DSA – particularly relevant for VLOPs and VLOSEs, which face stricter obligations under the Act.[13]

Notably, recent announcements by X and Meta regarding a shift toward community-based content moderation[14] may prove difficult to implement within the EU context, as the Code’s commitments – particularly those tied to the integrity of services – require platforms to take active, structured measures against disinformation. These include systemic interventions, transparency, and regular audits,[15] which go beyond merely relying on user-driven moderation models.

Regulatory Significance and Enforcement

Although not legally binding, the Code’s integration into the DSA carries regulatory weight. Under Article 35(1)(h), compliance with Article 45 codes – like the Disinformation Code – is considered a valid way to demonstrate risk mitigation.[16] Furthermore, Article 37 mandates independent audits for VLOPs/VLOSEs, during which adherence to the Code’s commitments will be reviewed.[17]

Recital 104 adds further pressure by suggesting that refusal to participate in voluntary codes, without proper justification, may be considered when evaluating broader non-compliance with the DSA. Thus, while signatories are not legally obligated to comply with the Code, non-compliance could indirectly affect their regulatory standing.[18] The auditability of the Code’s commitments begins on 1 July 2025, aligning with DSA compliance schedules. This timing ensures synchronization between voluntary commitment monitoring and mandatory legal assessments.

Implementation, Recommendations and Limitations

To support effective implementation, the EC and the EBDS have issued a set of non-binding recommendations for signatories, which fall outside the formal scope of the Code but are intended to strengthen its practical application. Key areas of focus include:[19]

  • Finalising the Rapid Response System to ensure timely interventions during national elections and crisis situations.
  • Enhancing collaboration within the Code’s permanent Taskforce to ensure follow-up on key commitments, particularly in areas related to the prevention of disinformation.
  • Improving data reporting to close existing transparency gaps and enable the development of measurable indicators to assess compliance and progress.

These recommendations aim to foster a dynamic and responsive governance model, particularly valuable in the rapidly evolving information ecosystem. However, the approach is not without criticism or limitations. The Code remains voluntary, and its lack of direct legal enforceability may weaken its practical impact. Additionally, fact-checking obligations have drawn industry criticism for being disproportionate or impractical. Moreover, analysis shows that the information and responses submitted by platforms frequently lacked specificity and often failed to fully meet the Code’s requirements.[20] Although this data is from 2023 – before the Code was formally recognized under the DSA – it still highlights persistent gaps in transparency and accountability.

Ultimately, the effectiveness of the Code depends on the European Commission’s enforcement posture and the extent to which VLOPs and VLOSEs see reputational and legal value in active compliance.

Future Directions and Emerging Challenges

The integration of the Code of Practice on Disinformation into the DSA marks a milestone in the EU’s digital governance efforts. By embedding what was once a voluntary initiative into a structured regulatory framework, the EU has established the Code as a key benchmark for compliance among major online platforms. However, the success of this integration will ultimately depend on several factors: the effectiveness of implementation, the quality and independence of audits, and the political will to hold platforms accountable for their commitments in addressing disinformation. This also raises questions about the viability of recent moderation models – such as community-driven fact-checking initiatives by Meta and X – within the EU, where platforms are expected to implement active and auditable measures under the Code.

Looking ahead, the European Commission is expected to promote two new voluntary codes within the DSA framework. Under Article 46, a Code of Conduct for Online Advertising is expected to enhance transparency and fairness across the digital advertising ecosystem. Simultaneously, Article 47 envisions a Code of Conduct for Accessibility, aimed at improving access to online services for persons with disabilities. Both were intended to be in place by 18 August 2025, yet as of this writing there have been no public updates regarding their progress.[21]

These pending codes will be crucial in testing the broader promise of the DSA’s co-regulatory framework. Their development and adoption will help determine whether voluntary standards – when paired with structured oversight – can meaningfully enhance digital accountability across the EU.


[1] European Commission. “Commission Endorses the Integration of the Voluntary Code of Practice on Disinformation into the Digital Services Act.” European Commission, 13 Feb. 2025. https://ec.europa.eu/commission/presscorner/detail/en/ip_25_505, hereinafter: EC Endorsement, 2025.

[2] EC Endorsement, 2025.

[3] Bertelli, Giacomo, Ambra Pacitti, and Erika De Santis. “Strengthening the EU’s Digital Landscape – Integration of the Revised Code of Conduct on Hate Speech and the Code of Practice on Disinformation into the DSA.” MediaLaws, 31 Mar. 2025, https://www.medialaws.eu/strengthening-the-eus-digital-landscape-integration-of-the-revised-code-of-conduct-on-hate-speech-and-the-code-of-practice-on-disinformation-into-the-dsa/.

[4] Vander Maelen, Carl, and Rachel Griffin. “Twitter’s Retreat from the Code of Practice on Disinformation Raises a Crucial Question: Are DSA Codes of Conduct Really Voluntary?” DSA Observatory, 12 June 2023, https://dsa-observatory.eu/2023/06/12/twitters-retreat-from-the-code-of-practice-on-disinformation-raises-a-crucial-question-are-dsa-codes-of-conduct-really-voluntary/.

[5] King, Stephen, Eoghan O’Keeffe, and Rosalyn English. “EU’s Code of Conduct on Disinformation Integrated into DSA.” A&L Goodbody, 24 Mar. 2025, https://www.techlaw.ie/2025/03/articles/content-regulation/eus-code-of-conduct-on-disinformation-integrated-into-dsa/.

[6] Galantino, Sharon. “How Will the EU Digital Services Act Affect the Regulation of Disinformation?” SCRIPTed, vol. 20, no. 1, Feb. 2023, pp. 89–110, https://script-ed.org/article/how-will-the-eu-digital-services-act-affect-the-regulation-of-disinformation/.

[7] EC Endorsement, 2025.

[8] King–O’Keeffe–English, 2025.

[9] Gillett, Francesca. “Twitter Pulls Out of Voluntary EU Disinformation Code.” BBC News, 27 May 2023, https://www.bbc.com/news/world-europe-65733969.

[10] European Commission. “The Code of Conduct on Countering Illegal Hate Speech Online+.” European Commission, 20 Jan. 2025, https://digital-strategy.ec.europa.eu/en/library/code-conduct-countering-illegal-hate-speech-online; King–O’Keeffe–English, 2025.

[11] Mündges, Stephan, and Kirsty Park. “But Did They Really? Platforms’ Compliance with the Code of Practice on Disinformation in Review.” Internet Policy Review, vol. 13, no. 3, 25 July 2024, https://policyreview.info/articles/analysis/platforms-compliance-code-of-practice-on-disinformation-review.

[12] King–O’Keeffe–English, 2025.

[13] EC Endorsement, 2025.

[14] McMahon, Liv, Zoe Kleinman, and Courtney Subramanian. “Facebook and Instagram Get Rid of Fact Checkers.” BBC News, 7 Jan. 2025, https://www.bbc.com/news/articles/cly74mpy8klo; Dedezade, Esat. “Meta to Launch X-Like Community Notes in Major Fact-Checking Shift.” Forbes, 13 Mar. 2025, https://www.forbes.com/sites/esatdedezade/2025/03/13/meta-to-launch-x-like-community-notes-in-major-moderation-shift/.

[15] King–O’Keeffe–English, 2025.

[16] Ibid.

[17] EC Endorsement, 2025.

[18] King–O’Keeffe–English, 2025.

[19] Ibid.

[20] Mündges–Park, 2024.

[21] Bertelli–Pacitti–De Santis, 2025.
