Introduction: Why the DSA Matters
The Digital Services Act (DSA) is a landmark piece of European Union legislation that fundamentally reshapes how online platforms and digital services operate within the EU’s internal market. Adopted in October 2022 and progressively enforced from 2023 onward, the DSA aims to make the online space safer, more transparent, more accountable, and more respectful of fundamental rights. Unlike earlier regulatory frameworks, which were fragmented and only loosely enforced, the DSA establishes a comprehensive, horizontal set of rules governing online intermediaries of all sizes — from small online marketplaces to the world’s largest social networks and search engines.
At its core, the DSA reflects the EU’s commitment to an internet ecosystem where rights are protected, harm is mitigated, and power imbalances between platforms and users are addressed. It is part of a broader European digital agenda that also includes the Digital Markets Act (DMA), the AI Act, and other regulatory instruments aimed at ensuring that the digital transformation aligns with European values like human dignity, privacy, and democracy.
Chapter 1 — Origins and Framework of the DSA
1.1 Historical Context
Before the DSA, Europe’s digital legal framework was largely shaped by the Electronic Commerce Directive of 2000 — a foundational law that set out basic duties and protections for online intermediaries but lacked the depth needed for today’s complex digital environment. In the two decades since, the digital landscape has been transformed by social networks, mobile apps, algorithmic recommendation systems, and platforms with billions of users worldwide — developments that outpaced the old rules.
In response, the European Commission proposed a modernized regulatory framework in late 2020, and the European Parliament and Council adopted the DSA in 2022. The regulation (officially Regulation (EU) 2022/2065) entered into force in November 2022 and was progressively phased in, with obligations for designated very large platforms applying from August 2023 and full applicability for all services from February 2024.
1.2 Purpose and Scope
The DSA’s stated goals are ambitious and multifaceted:
- Protect users from illegal content, goods, or services online.
- Counter systemic risks posed by large platforms, including the spread of harmful disinformation or manipulation.
- Enhance transparency and accountability of online platforms, particularly regarding content moderation, recommender systems, advertising, and data access for researchers.
- Empower users with greater control over their digital experience, including choices about algorithmic recommendations, reporting mechanisms, and meaningful appeal options.
Unlike many national laws, the DSA is a regulation — meaning it applies uniformly across all EU member states without the need for separate domestic legislation. Its obligations cover a wide range of “intermediary services,” including:
- Social networks and messaging platforms
- Search engines
- Online marketplaces and app stores
- Content‑sharing services
- Hosting and cloud services offered in the EU
This broad scope means the regulation touches daily digital life for hundreds of millions of Europeans and shapes the obligations of companies around the world that operate within the EU market.
Chapter 2 — How the DSA Works: Rules, Obligations, and Structures
2.1 Core Legal Architecture
One of the defining features of the DSA is its risk‑based approach. Obligations are proportionate and tailored to the size, role, and impact of the service:
- Most online platforms must comply with baseline rules on illegal content, reporting, and basic transparency.
- Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) — defined as those with at least 45 million monthly active recipients in the EU — are subject to the strictest obligations due to their systemic impact.
This graduated framework ensures that rules are not one‑size‑fits‑all but instead reflect the potential societal risks associated with different types of services.
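To make the graduated structure concrete, consider a minimal sketch of the tiering logic. This is purely illustrative: the 45 million threshold comes from the regulation, but the function and tier labels are invented for this example, not an official classification tool.

```python
# Illustrative sketch only: the 45 million threshold is taken from the DSA,
# but this helper and its tier labels are invented for illustration.
VLOP_THRESHOLD = 45_000_000  # average monthly active recipients in the EU

def dsa_tier(monthly_active_eu_recipients: int, is_search_engine: bool = False) -> str:
    """Return a simplified DSA obligation tier for a service."""
    if monthly_active_eu_recipients >= VLOP_THRESHOLD:
        return "VLOSE (strictest obligations)" if is_search_engine else "VLOP (strictest obligations)"
    return "baseline platform obligations"

print(dsa_tier(50_000_000))  # -> VLOP (strictest obligations)
print(dsa_tier(2_000_000))   # -> baseline platform obligations
```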
2.2 Key Obligations for All Platforms
Across the board, platforms must:
- Promptly act against illegal content or goods once notified, with clear mechanisms for reporting and action.
- Provide transparency on content moderation policies and enforcement outcomes.
- Ensure accountability in advertising, including banning targeted ads based on sensitive data categories (like religion or sexual orientation) and prohibiting profiling-based advertising directed at minors.
Platforms must also establish user‑friendly mechanisms for reporting suspected illegal content and for users to appeal content moderation decisions. The regulation’s emphasis on due process and fairness is a notable departure from the often opaque moderation ecosystems of the past.
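The notice-and-action mechanism lends itself to a concrete illustration. The sketch below models an illegal-content notice as structured data; the class and field names are hypothetical, but the required elements (a substantiated explanation, the exact URL, the notifier’s contact details, and a good-faith statement) track the notice requirements of Article 16 of the regulation.

```python
# Hypothetical sketch of an illegal-content notice as structured data.
# Field names are invented; the required elements mirror Article 16(2) DSA.
from dataclasses import dataclass

@dataclass
class IllegalContentNotice:
    explanation: str            # why the notifier considers the content illegal
    exact_urls: list[str]       # precise electronic location(s) of the content
    notifier_name: str          # name of the submitting person or entity
    notifier_email: str         # contact email address of the notifier
    good_faith_statement: bool  # confirmation the notice is accurate and complete

    def is_complete(self) -> bool:
        """A platform might use a check like this to triage incoming notices."""
        return bool(self.explanation and self.exact_urls
                    and self.notifier_email and self.good_faith_statement)
```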
2.3 Obligations for VLOPs and VLOSEs
For very large services, additional requirements include:
- Systemic risk assessments: Platforms must assess, document, and address risks like misinformation, harms to minors, and societal manipulation.
- Algorithmic transparency: Users must be able to opt out of algorithmic recommendations and choose chronological or other non‑personalized feeds.
- Independent audits: External assessments of compliance with major DSA provisions are required to verify internal findings — a step towards verifiable accountability.
- Comprehensive advertising disclosures: Large platforms must provide unprecedented transparency about who advertises, why users see specific ads, and how targeting works.
These obligations, combined, seek to address not just individual wrongs but systemic risks that large platforms can pose to democratic processes, public health, and basic human rights online.
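The recommender-system obligation, in particular, can be pictured in code. The sketch below is hypothetical (the data model and function names are invented), but it shows the essential idea: when a user opts out of profiling, the platform must fall back to a non-personalized option such as a chronological feed.

```python
# Hypothetical sketch: honoring a user's recommender choice. The requirement
# for a non-profiling option is in the DSA; everything else here is invented.
from typing import Callable

Post = dict  # stand-in for a post record, e.g. {"created_at": ..., "relevance": ...}

def chronological_feed(posts: list[Post]) -> list[Post]:
    """Non-personalized option: newest first, no profiling involved."""
    return sorted(posts, key=lambda p: p["created_at"], reverse=True)

def personalized_feed(posts: list[Post]) -> list[Post]:
    """Profiling-based option: stand-in for a real ranking model."""
    return sorted(posts, key=lambda p: p["relevance"], reverse=True)

def build_feed(posts: list[Post], opted_out_of_profiling: bool) -> list[Post]:
    ranker: Callable = chronological_feed if opted_out_of_profiling else personalized_feed
    return ranker(posts)
```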
2.4 Reporting, Enforcement, and Oversight
The DSA establishes a multi‑layered enforcement structure:
- National Digital Services Coordinators in each EU member state supervise and enforce DSA compliance for many platforms and act as local points of contact for users.
- The European Commission has primary oversight of VLOPs and VLOSEs, given their cross-border reach and potential impact.
Enforcement powers include investigations, requests for internal documentation, on‑site inspections, and penalties — including fines of up to 6% of global annual turnover in severe cases of non‑compliance.
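To put the 6% ceiling in perspective, a back-of-the-envelope calculation helps (the turnover figure below is invented for illustration):

```python
# Back-of-the-envelope illustration of the DSA's penalty ceiling.
# The 6% cap comes from the regulation; the turnover figure is invented.
def max_dsa_fine(global_annual_turnover_eur: float) -> float:
    """Maximum fine: 6% of total worldwide annual turnover."""
    return 0.06 * global_annual_turnover_eur

# A hypothetical platform with EUR 100 billion in worldwide annual turnover
# faces a theoretical maximum fine of EUR 6 billion.
print(f"EUR {max_dsa_fine(100e9):,.0f}")  # -> EUR 6,000,000,000
```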
Chapter 3 — What the DSA Means for Users, Businesses, and Society
3.1 For Users: Rights in the Digital Space
One of the DSA’s most visible impacts is on online user experience:
- Clearer explanations when content is removed, restricted, or accounts are suspended.
- Accessible appeal channels, either internally on the platform or via independent dispute resolution bodies outside traditional court systems.
- More control over recommended content, including the option to turn off algorithmic feeds on large platforms.
- Greater ad transparency and a ban on sensitive data‑based targeting, especially to minors.
Overall, these measures are designed to rebalance power in favor of users — giving individuals more insight into how platforms operate and more ways to challenge decisions affecting their digital presence.
3.2 For Smaller Platforms and Startups
While the DSA applies broadly, it does not impose the same heavy compliance requirements on micro and small enterprises. These entities are exempt from several of the more burdensome obligations (for as long as they remain below the relevant size thresholds), acknowledging that heavy compliance duties could stifle innovation and disadvantage emerging businesses relative to established players.
Nevertheless, even smaller services must meet key responsibilities, such as responding to illegal content reports and providing basic transparency on their policies. This ensures a baseline of digital rights protection across the entire ecosystem.
3.3 For Large Tech Platforms
For digital giants like Facebook, Instagram, TikTok, Google, X, Amazon, and others, the DSA represents a sea change. Many platforms have had to:
- Reengineer recommendation systems
- Enhance transparency databases and reporting practices
- Implement new age‑friendly features and compliance mechanisms
- Rebalance personalization with user control
The enforcement environment is active and evolving, with investigations and compliance checks continuing past initial implementation.
Chapter 4 — Enforcement, Compliance Challenges, and Legal Actions (2024–2026)
The period from 2024 through early 2026 has been particularly significant for the DSA’s real-world application.
4.1 Expansion of Supervision in Member States
National enforcement capacities have grown. For example, in the Netherlands, both the Autoriteit Persoonsgegevens (AP) and ACM (Netherlands Authority for Consumers & Markets) have been authorised since February 2025 to supervise DSA compliance within the country, with the ACM designated as the national Digital Services Coordinator.
4.2 Investigations and Fines
A major milestone came in December 2025, when the European Commission imposed a €120 million fine on X (formerly Twitter) for DSA violations — marking the first major non‑compliance sanction under the regime. The infractions included misleading design around verification systems, lack of transparency in advertising databases, and failure to provide researchers access to public data. This fine was not only punitive but symbolically significant, demonstrating that the EU will enforce compliance even against powerful global platforms.
4.3 Broader Compliance Challenges and Investigations
Beyond the X case, other major platforms have faced scrutiny:
- Meta and TikTok have been formally accused of breaching transparency obligations, with ongoing investigations into their reporting practices and mechanisms for flagging illegal content.
- In May 2025, the Commission launched infringement proceedings against several EU member states — including Spain, Poland, and others — for failing to establish robust Digital Services Coordinators or penalty frameworks. This underscores that effective enforcement requires both European and national implementation structures.
- Investigations into major adult content sites over inadequate protections for minors highlight how the DSA’s reach can extend to platforms traditionally seen as outside mainstream regulatory focus.
4.4 Ongoing Evaluation and Regulatory Refinement
In November 2025, the European Commission released a report evaluating how the DSA interacts with other EU legal frameworks and confirming that the threshold for VLOP/VLOSE designation remains suitable. This evaluation is part of a continuous refinement process intended to keep the rules attuned to changes in the digital ecosystem.
Additionally, standardized transparency reporting templates — aimed at harmonizing how platforms disclose moderation practices — were adopted and rolled out, with the first harmonized reports expected in 2026.
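The shape of such a harmonized report can be sketched informally. The structure below is an invented approximation of the reporting categories the DSA covers (notices received, actions taken, timeliness, and the role of automated moderation); it is not the official Commission template.

```python
# Invented approximation of a transparency report's shape; the official
# harmonized templates are defined by the Commission and differ in detail.
from dataclasses import dataclass, field

@dataclass
class TransparencyReport:
    period: str                                                     # e.g. "2026-H1"
    notices_received: dict[str, int] = field(default_factory=dict)  # by alleged-illegality category
    actions_taken: dict[str, int] = field(default_factory=dict)     # removals, restrictions, etc.
    median_hours_to_action: float = 0.0
    automated_decision_share: float = 0.0                           # fraction of decisions made by automated means

report = TransparencyReport(
    period="2026-H1",
    notices_received={"hate_speech": 12_400, "counterfeit_goods": 3_100},
    actions_taken={"removal": 9_800, "visibility_restriction": 1_200},
    median_hours_to_action=18.5,
    automated_decision_share=0.62,
)
```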
Chapter 5 — Broader Impacts: Society, Innovation, and Global Tensions
5.1 Digital Rights and Public Trust
The DSA has broad implications for user rights and online safety. By requiring transparent content moderation practices, banning harmful dark patterns, and empowering users in advertising choices, the DSA helps to rebuild trust in digital platforms. This is particularly vital in an age of rising misinformation, online harassment, and polarized information environments.
5.2 Innovation and Competition
Although the DSA introduces robust obligations, it also seeks to enable innovation:
- By setting clear, harmonized rules across the EU, it reduces regulatory fragmentation for platforms operating in multiple states.
- Smaller platforms are given breathing room through graduated obligations and exemptions.
- Transparency and accountability requirements can drive competition on trust and ethical service design.
This combination of safeguards and incentives is designed to nurture a diverse ecosystem of digital services that serve users more effectively.
5.3 Transatlantic and Global Reactions
The DSA has not existed in a vacuum. It has sparked debates across the Atlantic and in other jurisdictions — particularly regarding issues like free speech, regulatory overreach, and the global reach of EU law. Some U.S. policymakers have criticized the DSA as incompatible with American free‑speech norms, creating diplomatic and legal tensions over how digital content governance should operate globally.
At the same time, many global digital policy circles view the DSA as a model for how digital regulation might evolve, influencing regulatory discussions in other regions seeking to balance digital growth with rights protections.
Conclusion: The DSA’s Legacy and the Path Ahead
The Digital Services Act is far more than a new regulatory text: it is a transformative framework that reshapes expectations for how digital platforms operate, how users are protected, and how responsibilities are assigned in the digital ecosystem.
