In the first part of our article, we introduced the European Union’s Digital Services Act (DSA) and its significance for Legaltech platforms. We outlined the general obligations applicable to all digital service providers, emphasizing the importance of transparency, accountability, and user protection. Key topics included the appointment of a legal representative within the EU for non-EU companies, establishing clear points of contact for users and authorities, and ensuring transparent and user-friendly terms of service.
We also discussed the necessity of notifying users about significant changes to terms and enforcing these terms fairly, upholding fundamental rights such as freedom of expression.
Building on these foundational responsibilities, we now turn to the more specific compliance requirements under the DSA that Legaltech platforms must address to fully align with the new regulatory standards.
Transparency & Platform Accountability
Transparency & Platform Accountability under the Digital Services Act (DSA) focuses on ensuring that digital service providers operate openly and responsibly. It requires platforms to clearly disclose how their systems work—such as recommender algorithms and advertising practices—and to publish regular transparency reports about content moderation and enforcement actions.
These rules also aim to prevent deceptive design (like dark patterns) and ensure that users are fully informed about why they see certain content or ads. The goal is to build trust, empower users, and make platforms more accountable for their impact on digital society.
Transparency Reporting:
Do you publish an annual transparency report about your content moderation activities, written in clear language and in a publicly accessible format?
(This report should include metrics such as the number of removal orders received, user notices processed, content removed, appeals handled, and the use of automated tools. Online platforms should include additional details, such as the outcomes of disputes and actions taken against misuse.) (Article 15 DSA; Article 24 DSA)
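To make the reporting duty concrete, the required metrics can be gathered into a single structured record that feeds the published report. The following TypeScript sketch is illustrative only; the field names are our own, since the DSA prescribes what must be reported, not the format:

```typescript
// Illustrative structure for the Article 15/24 metrics; field names are
// hypothetical: the DSA mandates the content, not a schema.
interface TransparencyReport {
  periodStart: string;            // ISO 8601 date, e.g. "2024-01-01"
  periodEnd: string;
  removalOrdersReceived: number;  // orders from Member State authorities
  userNoticesProcessed: number;   // notices submitted under Article 16
  itemsRemoved: number;           // content removed or access-disabled
  appealsHandled: number;         // complaints via the internal complaint system
  automatedToolUse: {
    description: string;          // plain-language description of the tool
    accuracyIndicators: string;   // e.g. error rates and safeguards applied
  }[];
  // Additional detail expected of online platforms (Article 24):
  disputeOutcomes?: string;       // outcomes of out-of-court disputes
  misuseActions?: string;         // suspensions imposed for misuse
}

const annualReport: TransparencyReport = {
  periodStart: "2024-01-01",
  periodEnd: "2024-12-31",
  removalOrdersReceived: 12,
  userNoticesProcessed: 480,
  itemsRemoved: 230,
  appealsHandled: 35,
  automatedToolUse: [{
    description: "Keyword filter for manifestly illegal listings",
    accuracyIndicators: "4% false-positive rate; human review of every flag",
  }],
  disputeOutcomes: "3 disputes concluded; 1 moderation decision reversed",
  misuseActions: "5 accounts temporarily suspended for repeated misuse",
};
```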
Recommender System Transparency:
If your service uses automated recommender systems (e.g. to rank, curate, or suggest content to users), do your terms of service explain in plain terms the main parameters used by those systems and how they impact what users see? (Article 27 DSA)
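One practical way to meet this duty is to maintain the plain-language explanation alongside the ranking signals themselves, so the terms of service and the product stay in sync. A minimal sketch, with hypothetical names of our own choosing:

```typescript
// Hypothetical record pairing each "main parameter" of a recommender
// (Article 27) with its plain-language explanation.
interface RecommenderParameter {
  signal: string;          // internal ranking signal, e.g. "topicRelevance"
  plainLanguage: string;   // what users are told it does
  userAdjustable: boolean; // whether users can influence it
}

const mainParameters: RecommenderParameter[] = [
  { signal: "topicRelevance",
    plainLanguage: "How closely an item matches the legal topics you follow",
    userAdjustable: true },
  { signal: "recency",
    plainLanguage: "Newer case law and commentary is ranked higher",
    userAdjustable: false },
];
```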
Advertising Disclosure:
If your service displays online advertisements, do you clearly indicate to users that a piece of content is an advertisement, identify the person or entity on whose behalf the ad is shown (and who paid for it, if different), and provide meaningful information about the targeting criteria used to show that ad (along with a way for users to change those ad parameters)? (Article 26(1) DSA)
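In practice, each of these disclosures has to travel with the ad and be visible in real time. The sketch below shows one way the required elements might be modelled; all names are hypothetical:

```typescript
// Illustrative per-ad disclosure record covering the Article 26(1)
// elements; field names are our own assumptions.
interface AdDisclosure {
  isAdvertisement: true;         // the content must be labelled as an ad
  onBehalfOf: string;            // person or entity the ad is presented for
  paidBy?: string;               // the payer, if different from onBehalfOf
  targetingParameters: string[]; // main targeting parameters, in plain language
  adjustTargetingUrl: string;    // where users can change those parameters
}

const disclosure: AdDisclosure = {
  isAdvertisement: true,
  onBehalfOf: "Example Legal Publishing GmbH",
  targetingParameters: ["Practice area: employment law", "Language: German"],
  adjustTargetingUrl: "/settings/ad-preferences",
};
```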
User-Provided Ads:
Can users post content that is commercial in nature, such as paid or sponsored content?
Do you provide a mechanism for users to declare that their post is an advertisement or contains commercial communications?
Do you then clearly inform other users that the content is commercial? (Article 26(2) DSA)
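A simple way to picture this is a declaration flag set by the author at submission time, which the platform then turns into a label for everyone else. A minimal sketch with hypothetical names:

```typescript
// Sketch of an Article 26(2) declaration flow; names are illustrative.
interface UserPost {
  id: string;
  body: string;
  declaredCommercial: boolean; // set by the author when submitting
}

function commercialLabel(post: UserPost): string | null {
  // Shown to all other users whenever the author has declared the post
  // to be, or to contain, commercial communication.
  return post.declaredCommercial ? "Contains commercial communication" : null;
}
```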
No Dark Patterns:
Is your online interface free from design features that deceive or manipulate users? (For example, you do not use layouts or workflows that trick users into unintended actions or make it unreasonably difficult to cancel a service.) (Article 25 DSA)
Record-Keeping for Audits:
Do you maintain detailed records of content moderation decisions and legal compliance processes for regulatory audits?
Are these records structured in a way that facilitates external regulatory inspections and internal compliance reviews?
Do you ensure secure storage and appropriate access controls for these records, preventing unauthorized modifications?
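An audit-friendly pattern is an append-only log in which every moderation decision is written once and never edited in place; corrections reference the original entry rather than overwriting it. The structure below is a sketch under our own assumptions, not a prescribed format:

```typescript
// Illustrative audit-ready moderation record; fields are hypothetical.
interface ModerationRecord {
  recordId: string;
  decidedAt: string;     // ISO 8601 timestamp
  contentId: string;
  legalBasis: string;    // e.g. a terms-of-service clause or a statute
  decision: "removed" | "restricted" | "no_action";
  automated: boolean;    // whether an automated tool made the decision
  reviewerId?: string;   // present for human decisions
  supersedes?: string;   // recordId of an earlier entry this corrects
}

function appendRecord(log: ModerationRecord[], rec: ModerationRecord): void {
  // Append-only discipline: no update or delete path is exposed, which
  // supports the requirement to prevent unauthorized modifications.
  log.push(Object.freeze({ ...rec }));
}
```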

Protections for Minors & E-Commerce
Protections for Minors & E-Commerce under the Digital Services Act (DSA) are designed to enhance safety for young users and ensure trust and accountability in online marketplaces. Platforms must implement special safeguards for minors, including clear and age-appropriate terms, restrictions on targeted advertising, and enhanced privacy settings.
For e-commerce, online marketplaces are required to verify the identity of traders, provide transparent product information, and notify consumers if they have purchased illegal goods. These rules aim to protect vulnerable users and ensure that digital transactions within the EU are safe, fair, and transparent.
Minor Safety and Privacy:
If minors are likely to access your service, have you implemented appropriate measures to protect them?
Do these measures ensure a high level of privacy, safety, and security for minor users? Do you avoid showing targeted ads based on profiling when you know a user is a minor?
If your service is directed at or used by minors, do you present your terms and policies in a way children can understand? (Article 28 DSA; Article 14(3) DSA)
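Where ad targeting is concerned, one defensible design is a hard switch: as soon as the service knows with reasonable certainty that a user is a minor, profiling-based targeting is disabled and only contextual ads remain. A minimal sketch, assuming an `isMinor` signal the platform already maintains:

```typescript
// Hedged sketch of an Article 28(2) guard; types and names are our own.
interface AdTargetingContext {
  isMinor: boolean;            // known or reliably established minor status
  profilingSegments: string[]; // segments derived from personal data
}

function selectAdStrategy(ctx: AdTargetingContext): "contextual" | "profiled" {
  // Minors never receive ads based on profiling; contextual ads (based
  // on the page being viewed, not the person) remain permissible.
  return ctx.isMinor ? "contextual" : "profiled";
}
```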
Trader Traceability:
If your platform allows consumers to buy goods or services from third-party sellers (making you an online marketplace), do you collect and verify essential information about those traders before they can sell (including name, address, contact details, ID or registration number, payment details, and a self-certification of legality)?
Do you display the trader’s name and contact information clearly to consumers? (Article 30 DSA)
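The information Article 30 names maps naturally onto a "know your trader" record that must be complete and verified before the trader is activated. A sketch with hypothetical field names:

```typescript
// Illustrative trader-traceability record (Article 30); names are ours.
interface TraderRecord {
  name: string;
  address: string;
  email: string;
  phone: string;
  idOrRegistrationNumber: string; // identity document or trade-register number
  paymentAccountDetails: string;
  selfCertifiedLegality: boolean; // commitment to offer only compliant goods
  verifiedAt?: string;            // set once the details have been verified
}

function mayListProducts(trader: TraderRecord): boolean {
  // Traders may only sell once their details are verified and the
  // self-certification is on file.
  return Boolean(trader.verifiedAt) && trader.selfCertifiedLegality;
}
```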
Design for Product Information:
Are you operating an online marketplace?
Does your interface allow traders to provide all required product or service information to consumers? Does it enable them to include contact information, product descriptions, terms of sale, and any required warnings?
Does your interface clearly display the trader’s identity, such as their business name or logo? (Article 31 DSA)
Informing Buyers of Illegal Products:
If you discover that an illegal product or service was sold via your platform, do you notify all affected consumers?
Do you inform consumers who purchased it within the past six months, and do you provide them with the identity of the trader?
Do you also give information on available remedies or steps for redress?
(Article 32 DSA)
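The time-bounded part of this duty is easy to get wrong, so it is worth spelling out: the platform must reach everyone who bought the item in the six months before it became aware of the illegality. A runnable sketch, with all names being our own illustrations:

```typescript
// Sketch of the Article 32 look-back: select purchasers of the illegal
// item in the six months preceding the moment of awareness.
interface Purchase {
  buyerEmail: string;
  productId: string;
  purchasedAt: Date;
}

function buyersToNotify(
  purchases: Purchase[], productId: string, becameAwareAt: Date,
): Purchase[] {
  const windowStart = new Date(becameAwareAt);
  windowStart.setMonth(windowStart.getMonth() - 6); // six-month look-back
  return purchases.filter(p =>
    p.productId === productId &&
    p.purchasedAt >= windowStart &&
    p.purchasedAt <= becameAwareAt);
}
```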

Additional Obligations for Very Large Online Platforms
(The following section applies only if your service has more than 45 million average monthly active users in the EU and has been designated a “Very Large Online Platform” under the DSA.)
Systemic Risk Management:
Have you conducted a thorough assessment of systemic risks on your platform — such as the dissemination of illegal content, threats to civic discourse or electoral processes, public health risks, or violations of fundamental rights — and have you taken reasonable measures to mitigate those identified risks? (Articles 34–35 DSA)
Crisis Response Plan:
Do you have the capability to deploy special measures to mitigate unforeseen crises affecting public security or public health, if required by the European Commission under the DSA's crisis response mechanism? (Article 36 DSA)
Independent Audit:
Do you undergo an independent annual audit of your platform’s compliance with the DSA requirements, and do you make the summary of the audit report public? (Article 37 DSA)
Recommender Choice:
Do you offer users at least one option to use your content recommender system that is not based on profiling (i.e. not based on personal data tracking)? (Article 38 DSA)
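The simplest compliant variant is usually a chronological feed, since it uses no personal data at all. A minimal sketch of the required choice, with illustrative types:

```typescript
// Sketch of an Article 38 option: at least one recommender variant
// that does not rely on profiling. Names and signals are our own.
type FeedMode = "personalised" | "non_profiling";

interface FeedItem {
  id: string;
  publishedAt: Date;
}

function rankFeed(items: FeedItem[], mode: FeedMode): FeedItem[] {
  if (mode === "non_profiling") {
    // No personal data involved: rank purely by recency.
    return [...items].sort(
      (a, b) => b.publishedAt.getTime() - a.publishedAt.getTime());
  }
  // The profiling-based, personalised ranking would go here.
  return items;
}
```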
Ad Repository:
Have you built a publicly accessible database that includes all advertisements served on your platform?
Does the database detail the ad content, the advertiser’s identity, and the payer? Does it include the targeting criteria and the date the ad was displayed?
Have you ensured the database contains no personal user data? (Article 39 DSA)
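The elements Article 39 lists can be read as the columns of a public table, with reach reported only in aggregate. An illustrative entry, with field names of our own:

```typescript
// Illustrative ad-repository entry (Article 39); the repository is
// public and must contain no personal data of the users shown the ad.
interface AdRepositoryEntry {
  adContent: string;             // the creative or its text
  advertiser: string;            // on whose behalf the ad was presented
  payer: string;                 // who paid for it, if different
  targetingParameters: string[]; // main parameters used, in plain language
  firstShown: string;            // ISO date the ad was first presented
  lastShown: string;             // and the last date it was shown
  aggregateReach: number;        // aggregate totals only, never user-level data
}

const entry: AdRepositoryEntry = {
  adContent: "Banner: 'Contract review in minutes'",
  advertiser: "Example Legaltech SE",
  payer: "Example Legaltech SE",
  targetingParameters: ["Interest: legal operations", "Region: EU"],
  firstShown: "2024-03-01",
  lastShown: "2024-03-31",
  aggregateReach: 120000,
};
```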
Data Sharing with Authorities/Researchers:
Do you have procedures to share necessary data with Digital Services Coordinators, the European Commission, or vetted academic researchers upon proper request, so they can monitor compliance and study systemic risks on your platform? (Article 40 DSA)
Compliance Function:
Have you established a compliance function at the management level (e.g. appointed a compliance officer or team) to ensure ongoing adherence to the DSA? Does this function have sufficient authority and resources within your organization? (Article 41 DSA)