
The EU AI Act and Digital Services Act: A New Era for Digital Regulation

In 2024, the EU AI Act stirred up discussions about artificial intelligence regulation across Europe. Now, in 2025, the Digital Services Act (DSA) is taking center stage. Together, these laws are reshaping how digital technologies operate in the EU. While they focus on different areas, they are interconnected in many ways.

What is the Digital Services Act (DSA)?

The DSA is a major European law aimed at regulating online platforms and services. Its strictest obligations began applying to Very Large Online Platforms (VLOPs) in August 2023, and the law has been fully applicable to all platforms since February 2024, with smaller providers still adapting to its rules. The DSA focuses on creating a safer and more transparent digital space for users.

Here are its key highlights:

  • Content Moderation: Platforms must remove illegal content quickly and explain their moderation decisions to users.
  • Transparency in Advertising: Online ads must be clearly labeled, and platforms must disclose how they target users.
  • Risk Assessments: VLOPs must evaluate risks related to illegal content, societal harm, and fundamental rights.
  • Algorithm Accountability: Platforms need to explain how their algorithms work, especially those used for recommendations or search results.
  • Protection of Minors: Special measures are required to safeguard children from harmful content and advertising.
  • Extraterritorial Scope: Non-EU companies providing services in Europe must also comply with the DSA rules.

The DSA aims to balance innovation with user protection, making sure platforms are held accountable for their actions.
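
To make the content-moderation and transparency duties above more concrete, here is a minimal sketch of how a platform might record the "statement of reasons" it owes users for a moderation decision. The DSA does not prescribe this schema; every field name below is an illustrative assumption.

```python
# Illustrative sketch only: the DSA requires platforms to explain moderation
# decisions to users, but it does not prescribe this schema. All field names
# below are assumptions chosen for readability.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class StatementOfReasons:
    """A hypothetical internal record backing a moderation decision."""
    content_id: str                 # identifier of the removed or restricted item
    decision: str                   # e.g. "removal", "visibility_restriction"
    legal_ground: str               # law or terms-of-service clause relied upon
    facts_and_circumstances: str    # plain-language explanation shown to the user
    automated_detection: bool       # whether the content was flagged automatically
    automated_decision: bool        # whether the decision itself was automated
    decided_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


if __name__ == "__main__":
    record = StatementOfReasons(
        content_id="post-12345",
        decision="removal",
        legal_ground="Terms of Service, section on illegal hate speech",
        facts_and_circumstances="The post was reported and confirmed as illegal hate speech.",
        automated_detection=True,
        automated_decision=False,
    )
    print(record)
```

A record like this can feed both the user-facing explanation and the platform's transparency reporting.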

Comparing the EU AI Act and DSA

Although the EU AI Act and DSA regulate different aspects of digital technologies, they overlap in several ways. Both laws aim to create a safer and more ethical digital environment while addressing unique challenges. Below is a breakdown of their shared points:

1. Risk Assessment Requirements

Both laws require platforms to assess risks:

  • The DSA focuses on risks like illegal content and harm to society or fundamental rights.
  • The AI Act requires risk assessments for high-risk AI systems used in critical areas like healthcare or law enforcement.

Platforms using AI systems for content moderation or user interaction must comply with both laws’ risk evaluation rules.
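
As a rough illustration of how those two risk-evaluation duties could meet in one place, the sketch below models a single internal risk-register entry that captures both a DSA-style systemic-risk category and an AI Act risk class. The categories, class names, and example classification are simplifying assumptions, not legal definitions.

```python
# Illustrative sketch only: neither law prescribes a data model for risk
# assessments. The categories and field names below are assumptions meant to
# show how one internal register could cover both sets of obligations.
from dataclasses import dataclass
from enum import Enum


class DsaSystemicRisk(Enum):
    ILLEGAL_CONTENT = "dissemination of illegal content"
    FUNDAMENTAL_RIGHTS = "negative effects on fundamental rights"
    CIVIC_DISCOURSE = "negative effects on civic discourse or elections"
    MINORS_AND_WELLBEING = "harm to minors, public health or well-being"


class AiActRiskClass(Enum):
    PROHIBITED = "prohibited practice"
    HIGH_RISK = "high-risk system"
    LIMITED_RISK = "limited risk (transparency obligations)"
    MINIMAL_RISK = "minimal risk"


@dataclass
class RiskRegisterEntry:
    system_name: str
    description: str
    dsa_risks: list[DsaSystemicRisk]
    ai_act_class: AiActRiskClass        # illustrative classification, not legal advice
    mitigations: list[str]


entry = RiskRegisterEntry(
    system_name="recommender-v2",
    description="AI-based feed ranking used for content recommendation",
    dsa_risks=[DsaSystemicRisk.ILLEGAL_CONTENT, DsaSystemicRisk.CIVIC_DISCOURSE],
    ai_act_class=AiActRiskClass.LIMITED_RISK,
    mitigations=["non-personalised feed option", "periodic bias audits"],
)
print(entry.system_name, entry.ai_act_class.value)
```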

2. Transparency Obligations

Transparency is central to both laws:

  • The DSA mandates clear labeling of ads and annual transparency reports from platforms.
  • The AI Act requires that users be informed when they are interacting with an AI system, such as a chatbot, and that AI-generated or manipulated content, such as deepfakes, be labeled as such.

Platforms combining AI features with large-scale services must meet transparency standards under both frameworks.
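
For the AI Act side of this obligation, a minimal sketch of a chatbot disclosure might look like the following; the notice text and function are hypothetical, since the law leaves the exact wording and mechanism to the provider.

```python
# Illustrative sketch only: the AI Act expects users to be informed when they
# interact with an AI system, but the wording and mechanism are up to the
# provider. The notice text and function below are hypothetical examples.
AI_DISCLOSURE = "You are chatting with an automated AI assistant."


def wrap_chatbot_reply(reply_text: str, first_message_in_session: bool) -> str:
    """Prepend an AI-interaction notice to the first reply of a session."""
    if first_message_in_session:
        return f"{AI_DISCLOSURE}\n\n{reply_text}"
    return reply_text


print(wrap_chatbot_reply("Hello! How can I help you today?", True))
```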

3. Regulation of Large Platforms and High-Risk AI Systems

The DSA targets Very Large Online Platforms (VLOPs), while the AI Act governs high-risk AI systems:

  • VLOPs must follow stricter rules for content moderation and algorithm accountability under the DSA.
  • High-risk AI systems must meet technical documentation and user safety requirements under the AI Act.

Platforms that deploy high-risk AI tools (e.g., biometric systems or recommendation algorithms) face obligations from both laws.

4. Extraterritorial Scope

Both laws apply to companies outside the EU if their services impact European users:

  • The DSA ensures global platforms comply with European standards for safety and transparency.
  • The AI Act requires non-EU companies offering high-risk AI systems in Europe to follow its rules.

For example, a U.S.-based social media platform using AI-driven content curation must adhere to both laws when operating in Europe.

5. Harmonization with GDPR

Both laws align with GDPR principles for data protection:

  • The AI Act references GDPR for lawful processing of personal data by AI systems.
  • The DSA ensures platforms handle user data transparently and securely, as per GDPR standards.

Companies need integrated compliance strategies to meet GDPR alongside these new regulations.

6. Content Moderation Meets Generative AI

Generative AI tools are regulated under both frameworks:

  • The DSA requires platforms to remove illegal content created by users or automated systems like generative AI tools.
  • The AI Act imposes transparency rules on generative AI outputs, such as deepfakes or automated news articles.

Platforms using generative AI for creating or moderating content must address obligations under both laws simultaneously.
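
One hedged way to picture the AI Act's labeling duty for generative outputs is to attach machine-readable provenance metadata before publication, as in the sketch below. The metadata format and field names are assumptions for illustration; the Act requires marking, not this particular structure.

```python
# Illustrative sketch only: the AI Act requires AI-generated or manipulated
# content (e.g. deepfakes) to be marked as such, but it does not mandate this
# particular metadata format. Field names here are assumptions.
import json
from datetime import datetime, timezone


def label_generated_content(content: str, model_name: str) -> dict:
    """Attach a machine-readable 'AI-generated' label to a piece of content."""
    return {
        "content": content,
        "provenance": {
            "ai_generated": True,
            "generator": model_name,  # hypothetical model identifier
            "generated_at": datetime.now(timezone.utc).isoformat(),
            "disclosure": "This content was generated by an AI system.",
        },
    }


if __name__ == "__main__":
    item = label_generated_content("Sample AI-written summary.", "acme-gen-1")
    print(json.dumps(item, indent=2))
```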

7. Enforcement Timelines

The enforcement timelines of these laws overlap:

  • The DSA's stricter rules for VLOPs have applied since August 2023, with the remaining platforms covered since the law became fully applicable in February 2024.
  • The key provisions of the AI Act take effect between 2025 and 2027: prohibitions on unacceptable-risk practices from February 2025, obligations for general-purpose AI models from August 2025, and most high-risk system requirements from August 2026.

Businesses need coordinated compliance plans to handle both sets of requirements efficiently during this transition period.
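
As a starting point for such a plan, the sketch below lists the headline application dates discussed above in one place. It is a simplified view; milestones should always be checked against the official texts before being relied on.

```python
# Simplified sketch of the headline application dates discussed above; always
# verify milestones against the official texts before relying on them.
from datetime import date

COMPLIANCE_MILESTONES = {
    date(2023, 8, 25): "DSA obligations apply to designated VLOPs/VLOSEs",
    date(2024, 2, 17): "DSA fully applicable to all intermediary services",
    date(2025, 2, 2):  "AI Act prohibitions on unacceptable-risk practices apply",
    date(2025, 8, 2):  "AI Act rules for general-purpose AI models apply",
    date(2026, 8, 2):  "Most AI Act high-risk system requirements apply",
    date(2027, 8, 2):  "Extended transition for high-risk AI in regulated products",
}

for milestone_date, description in sorted(COMPLIANCE_MILESTONES.items()):
    print(milestone_date.isoformat(), "-", description)
```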

A New Era of Digital Regulation

The EU’s layered approach to regulating digital technologies reflects its commitment to protecting users while fostering innovation. By addressing risks from both online platforms and artificial intelligence systems, these laws aim to create a safer digital environment for all Europeans.

For businesses operating in Europe, navigating these overlapping regulations will be challenging but necessary for success in this new era of accountability and transparency.

