Introduction: Constitutional Commitments and the Digital Transformation of Expression
The European Union has progressively constructed a far-reaching framework to counter hate speech both offline and online, combining legislative initiatives, platform governance mechanisms, and sustained investment in prevention and monitoring. Anchored in the Union’s constitutional values and informed by the profound structural changes brought about by digitalisation, this framework rests on three interdependent dimensions: legal and policy foundations, monitoring and implementation, and preventive and capacity-building measures. What follows is a systematic exposition of these foundations, the historical and political trajectory that culminated in the Digital Services Act, and the institutional architecture designed to govern online expression within the internal market.
Legal and Normative Foundations of EU Hate-Speech Regulation
The European Union’s efforts to combat hate speech are grounded in its foundational constitutional values. Article 2 of the Treaty on European Union (TEU) affirms that the Union is built upon respect for human dignity, freedom, democracy, equality, and human rights. Complementing this, the Charter of Fundamental Rights of the European Union safeguards freedom of expression under Article 11, yet expressly permits this right to be limited under the conditions set out in Article 52(1) of the Charter, including for purposes such as the prevention of crime and the protection of the rights and freedoms of others. Read together, these provisions create a coherent constitutional framework that not only safeguards freedom of expression but also authorises proportionate restrictions on forms of expression that undermine the Union’s essential values. This framework provides the European Union with a clear legal and normative foundation for adopting measures to regulate and counter hate speech within its jurisdiction.
Early initiatives of the European Union in this field concentrated on cooperation among the Member States. Council Joint Action 96/443/JHA of 15 July 1996 concerning action to combat racism and xenophobia encouraged the adoption of national criminal law measures and sought to strengthen judicial cooperation so as to prevent perpetrators from exploiting divergent legal frameworks. This instrument was subsequently followed by Council Framework Decision 2008/913/JHA of 28 November 2008 on combating certain forms and expressions of racism and xenophobia by means of criminal law, which requires Member States to criminalise public incitement to violence or hatred directed against groups or individuals defined by race, colour, religion, descent, or national or ethnic origin, as well as the public condoning, denial, or gross trivialisation of genocide, crimes against humanity, and war crimes.
However, the EU’s direct legislative competence remains limited: the criminal-law competence under Article 83 TFEU (Treaty on the Functioning of the European Union) currently covers only an exhaustive list of specific “EU crimes,” and extending that list to hate speech more broadly would require a unanimous Council decision with the consent of the European Parliament.
The Emergence of the Digital Services Act: Regulatory Logic and Legal Basis
The Digital Services Act, formally Regulation (EU) 2022/2065, represents the European Union’s most significant regulatory intervention in the digital environment and establishes a comprehensive framework for addressing illegal online content, including forms of hate speech that are prohibited under national or Union law. Unlike instruments based on the criminal law competence of the European Union under Article 83 of the TFEU, the Digital Services Act (DSA) relies on the internal market legal basis of Article 114 TFEU, which enables the Union to regulate digital services in order to ensure the proper functioning of the internal market.
Through this internal market rationale, the DSA introduces layered obligations for intermediary services, online platforms, and very large online platforms and search engines, requiring, among other things, the establishment of notice-and-action mechanisms, transparency reporting, cooperation with trusted flaggers, and the assessment and mitigation of systemic risks relating to the dissemination of illegal content, including hate speech. The core of the notice-and-action mechanism is sketched below.
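To make the notice-and-action duty concrete, the following minimal sketch models the elements that Article 16(2) DSA requires a valid notice to contain. The Regulation prescribes only the substance of a notice, not any data format or interface, so the class, field, and method names here are illustrative assumptions rather than anything drawn from the legal text.

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class Article16Notice:
    """Illustrative model of a notice under Article 16(2) DSA.

    All names are hypothetical: the DSA prescribes what a notice must
    contain, not how a provider structures it technically.
    """
    explanation: str                      # Art. 16(2)(a): substantiated reasons why the
                                          #   notifier alleges the content is illegal
    content_locations: list[str] = field(default_factory=list)
                                          # Art. 16(2)(b): exact electronic location(s),
                                          #   such as the precise URL or URLs
    notifier_name: Optional[str] = None   # Art. 16(2)(c): identity may be withheld for
    notifier_email: Optional[str] = None  #   certain offences (Directive 2011/93/EU)
    good_faith_confirmed: bool = False    # Art. 16(2)(d): statement of good-faith belief
                                          #   that the notice is accurate and complete

    def is_complete(self) -> bool:
        """Sketch of a completeness check: under Art. 16(3), a notice containing
        these elements can give rise to actual knowledge or awareness for
        liability purposes, where the illegality is apparent to a diligent
        provider without a detailed legal examination."""
        return (
            bool(self.explanation)
            and bool(self.content_locations)
            and self.good_faith_confirmed
        )
```

On this reading, a provider processing such notices must then decide on them in a timely, diligent, non-arbitrary, and objective manner and inform the notifier of its decision, as Article 16 further requires.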
Why the DSA Was Necessary: Economic, Societal, and Democratic Drivers
The contemporary digital economy is characterised by a profound concentration of economic power in a small number of multinational technology corporations, which collectively command a significant share of global market valuation and exert considerable influence over information flows, commercial interactions and public discourse. Online commercial activity has expanded at an accelerated pace in recent years, with the Union witnessing substantial growth in electronic commerce and a corresponding rise in digital turnover, developments that have intensified concerns regarding the capacity of dominant platforms to distort competition and shape market outcomes.
Although online platforms facilitate the exercise of fundamental rights by enabling access to information, fostering communicative exchange and supporting vibrant digital public spheres, they simultaneously function as vectors for the dissemination of disinformation, illegal content and various forms of hateful expression, thereby generating significant societal and regulatory challenges.
From the perspective of Union law, the magnitude of these risks necessitates a regulatory framework capable of addressing the structural vulnerabilities of the digital environment while upholding the constitutional commitments of the Union, including the protection of fundamental rights and the preservation of a competitive, transparent, and accountable internal market.
From the E-Commerce Directive to a New Regulatory Order
The DSA represents a fundamental recalibration of the European Union’s regulatory strategy toward the digital environment, reflecting the Union’s recognition that the architecture of online intermediary services has become integral to the functioning of the internal market and to the protection of fundamental rights in contemporary society. For many years the European Union operated within an established legislative order governing electronic commerce, most prominently the E-Commerce Directive (Directive 2000/31/EC) adopted at the turn of the millennium, yet the sweeping and increasingly intricate transformation of the digital environment over the past decade exposed significant structural limitations in that earlier framework, which relied on minimal harmonisation and a largely reactive posture toward the growing array of online harms. By the late 2010s, it had become increasingly apparent that the expansion of digital services, the rise of algorithmically driven platforms, and the consolidation of market power among a small number of dominant actors required a renewed and comprehensive regulatory response grounded in the competences conferred by the Treaties.
The Commission therefore proposed the DSA in 2020 as part of a broader legislative package that also included the Digital Markets Act, both instruments constituting central pillars of the Union’s evolving “Digital Strategy” and its ambition to assert strategic autonomy in the governance of digital infrastructures. In view of the previously noted constraints inherent in the Union’s criminal law competence under Article 83 TFEU, whose exhaustive list of harmonisable offences prevents it from serving as the basis for a comprehensive regulatory regime governing online hate speech, the internal market competence supplies a more coherent and adaptable jurisdictional foundation for addressing the systemic challenges generated by digital intermediaries, irrespective of Member State divergences in the definition or prosecution of hate speech. The Digital Services Act thus constitutes an explicit regulatory shift away from a model centred on ex post liability rules toward a framework that imposes ex ante obligations designed to shape platform governance practices, enhance systemic accountability, and structure the responsibilities of intermediaries in a manner consistent with the Union’s constitutional commitments.
From its inception, the DSA was articulated not as a narrowly circumscribed recalibration of the existing acquis but as a legislative intervention of substantial conceptual and structural ambition, one expressly intended to supersede the core architecture of the E-Commerce Directive and to inaugurate a comprehensive and technologically neutral regulatory order capable of governing the broad and continually evolving constellation of online intermediary services. In doing so, the Union sought to construct a harmonised framework that would operate across diverse platform configurations and intermediary functions, thereby imposing a coherent set of obligations that transcends specific technical modalities and reflects a systemic reorientation of platform regulation within the internal market.
Broader Union Initiatives Shaping the DSA’s Legislative Context
The emergence of the DSA cannot be understood in isolation from the wider institutional and political constellation that progressively steered the Union toward a more coherent and anticipatory approach to digital governance. Among the earliest manifestations of this trajectory was the establishment of the EU Observatory on the Online Platform Economy in 2018, an initiative that signalled the Union’s recognition of the need for dedicated, continuous, and methodologically robust monitoring of platform dynamics. Mandated to assess market developments, detect systemic risks, and inform future regulatory interventions, the Observatory served as an institutional mechanism for consolidating evidence about the structural transformations underway within the platform economy.
Political impetus intensified with the articulation of the 2019 Political Guidelines of the President of the European Commission, which elevated digital transformation to the status of a defining strategic priority for the Union. In these guidelines, the creation of a secure, trustworthy, and rights-respecting digital sphere was presented not merely as an economic imperative but as a foundational requirement for sustaining the Union’s democratic legitimacy and societal resilience in an increasingly data-driven environment.
This evolving agenda was further reinforced in the aftermath of the COVID-19 pandemic, when the Union’s Recovery and Resilience Facility, embedded within the broader NextGenerationEU programme, earmarked unprecedented financial resources for digitalisation across Member States. The prioritisation of digital investment within the Union’s post-pandemic reconstruction strategy underscored the political and economic salience of establishing a clear, harmonised, and future-proof regulatory environment for digital infrastructures and services.
Taken collectively, these initiatives fostered a growing institutional awareness that the Union required a regulatory instrument capable of addressing not only the competition-related and economic consequences of concentrated platform power but also the broader societal, democratic, and fundamental rights implications of digital intermediation. This recognition was crystallised in the 2020 European Democracy Action Plan, which identified the proliferation of disinformation, the manipulation of information ecosystems, and the widespread dissemination of illegal hate speech as systemic challenges necessitating a more coherent and integrated regulatory response.
Legislative Evolution of the Digital Services Act
The legislative pathway culminating in the adoption of the DSA evidences a deliberate and carefully sequenced effort by the European Union to recalibrate its regulatory posture toward the digital environment. This trajectory reflects the Union’s attempt to respond to accelerating technological developments, increasing platform concentration, and the emerging awareness that existing legal instruments were no longer adequate to govern the complexity of contemporary digital intermediation. The formal legislative process commenced on 15 December 2020, when the European Commission presented its proposal for the DSA as part of a twin-pillar reform package alongside the Digital Markets Act (DMA). This moment marked the Regulation’s entry into the Union’s legislative machinery and signalled the Commission’s intention to establish a harmonised, forward-looking regulatory framework for online intermediaries.
Over the course of 2021 and early 2022, the European Parliament and the Council engaged in sustained trilogue negotiations, culminating in a provisional political agreement on 23 April 2022. This agreement constituted a significant juncture of institutional alignment, reflecting consensus on the necessity of a comprehensive reform of the Union’s digital governance model. Following this political settlement, the European Parliament adopted the DSA on 5 July 2022, and the Council gave its final approval on 4 October 2022, thereby enabling its subsequent publication in the Official Journal of the European Union and concluding the formal legislative phase.
The Regulation was published in the Official Journal of the European Union on 27 October 2022 and entered into force on 16 November 2022, thereby initiating a staggered implementation process. This staggered timetable ensured that the obligations of differing categories of digital intermediaries would become applicable in a manner aligned with their operational burdens and risk profiles.
The operationalisation of the DSA advanced decisively on 25 April 2023, when the European Commission issued its first set of formal designation decisions. In this initial round, the Commission identified 17 Very Large Online Platforms (VLOPs) and 2 Very Large Online Search Engines (VLOSEs), each exceeding the statutory threshold of 45 million average monthly active recipients in the Union, on the basis of the user figures that providers were required to publish by 17 February 2023. The Commission also emphasised the construction of a new supervisory architecture, signalling its intention to exercise direct oversight over VLOPs and VLOSEs, while national Digital Services Coordinators, required to be operational by 17 February 2024, supervise all other intermediary services. Complementing this institutional structure, the Commission announced the role of the newly established European Centre for Algorithmic Transparency (ECAT), which supports risk assessments and compliance evaluations concerning the algorithmic systems deployed by designated platforms. Furthermore, the Commission launched a call for evidence concerning the DSA’s provisions on data access for vetted researchers, seeking input on the development of a delegated act establishing detailed procedural and technical parameters. As of 2025, this initiative has progressed significantly: the Commission adopted the Delegated Regulation on data access for researchers in July 2025, formalising the framework governing requests, access procedures, safeguards, and provider obligations.
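The designation mechanics themselves are compact: Article 33(1) fixes the threshold at 45 million average monthly active recipients, a figure corresponding to roughly 10% of the Union population, which the Commission may adjust by delegated act as that population changes significantly. The minimal sketch below, whose constant and function names are illustrative assumptions only, captures this threshold test.

```python
# Minimal sketch of the Article 33(1) DSA designation test.
# Constant and function names are illustrative, not drawn from the Regulation.

VLOP_THRESHOLD = 45_000_000  # average monthly active recipients in the Union,
                             # roughly 10 % of the EU population

def meets_designation_threshold(avg_monthly_active_recipients: int) -> bool:
    """True if a platform or search engine has a number of average monthly
    active recipients equal to or higher than the threshold that triggers
    Commission designation as a VLOP or VLOSE under Article 33 DSA."""
    return avg_monthly_active_recipients >= VLOP_THRESHOLD

# Providers of online platforms had to publish these user figures by
# 17 February 2023 (Art. 24(2) DSA); the first designation round on
# 25 April 2023 covered 17 VLOPs and 2 VLOSEs.
print(meets_designation_threshold(50_000_000))  # True: designation candidate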
Conclusion: The Consolidation of a New European Digital Constitutionalism
The transformation of the European Union’s regulatory landscape demonstrates the breadth and ambition of the initiatives deployed to confront the challenges posed by online hate speech and to govern the rapidly evolving digital sphere. Over several decades, the Union has advanced a succession of interconnected initiatives, ranging from early criminal law instruments directed against racist and xenophobic expression to coordinated policy strategies, monitoring mechanisms, codes of conduct, transparency programmes, and substantial legislative reforms. Considered together, these initiatives reveal a sustained institutional effort to shape an online environment that reflects the values of the Union and supports the functioning of a democratic and inclusive society.
The DSA now stands as the most ambitious expression of this trajectory. Rather than accepting the reactive limitations embedded in the earlier regulatory model, the Union has repositioned responsibility, transparency, and systemic risk awareness at the centre of platform governance. The DSA synthesises a wide array of initiatives across the domains of market regulation, protection of fundamental rights, technological oversight, algorithmic transparency, and cross-border enforcement cooperation. Its purpose is not only to address the circulation of illegal hate speech but also to construct a stable and coherent environment in which users, platforms, regulators, and democratic institutions can interact with greater clarity and mutual accountability.
This consolidation represents a significant moment in the emergence of a European digital constitutionalism. It underscores the Union’s determination to align the governance of digital infrastructures with its foundational principles while ensuring that the many initiatives in legislation, supervision, research, and prevention operate in a coherent and mutually reinforcing manner. The cumulative result is the gradual formation of a more resilient and rights-respecting digital order, one capable of sustaining the vitality of the public sphere and upholding the constitutional values at the core of the European project.