What is the Digital Services Act (DSA) and How Does It Impact Me?


The Digital Services Act (DSA) is European Union legislation, proposed by the European Commission, aimed at regulating a variety of digital services within the EU. The DSA is part of a broader package of EU regulations called the Digital Services Act Package, which also includes the Digital Markets Act (DMA). These acts seek to address various challenges associated with the digital economy, such as the dominance of large online platforms, online misinformation, and harmful content.

The main points of the DSA are as follows:

  • Categorization of Digital Services: The DSA introduces a classification system to categorize digital “intermediary services” based on their size and impact. Services are divided into four categories: (a) Very Large Online Platforms and Very Large Online Search Engines (VLOPs and VLOSEs), (b) online platforms, (c) hosting services, and (d) intermediary services. The obligations and requirements for each category differ based on its size and role in the digital ecosystem.
  • Liability and Content Moderation: The DSA establishes a regulatory framework for the liability of online platforms for the content they host and imposes obligations on platforms to take measures against illegal content, such as hate speech, terrorist content, and counterfeit goods. The act emphasizes transparency in content moderation processes and provides mechanisms for users to appeal content removal or account suspension. Every online platform is required to establish a direct and efficient single point of contact through which users can communicate electronically, and platforms cannot rely solely on automated tools to fulfill this obligation.
  • Transparency and Algorithmic Processes: The DSA aims to improve transparency around a platform’s content moderation processes, as well as the algorithms online platforms use. Very Large Online Platforms are required to provide more information about how their algorithms work, particularly in relation to content curation and promotion, to address concerns about the impact of algorithms on content visibility and dissemination. Further, intermediary services must disclose all restrictions they impose in their Terms of Service, Terms of Use, or Terms and Conditions. This includes content moderation policies, operations that affect users, distinctions between algorithmic decision-making and human review, and the processes for handling complaints within their internal systems. The DSA explicitly emphasizes the need to balance “freedom of expression”, “freedom and pluralism of the media”, and “other fundamental rights”. It requires platforms to be transparent in their approach and to ensure that automation efforts do not encroach upon these rights of EU citizens.
  • Independent Auditing and Compliance: The DSA introduces independent audits for Very Large Online Platforms to assess their compliance with the act's requirements. These audits will be conducted by third-party auditors to ensure platforms adhere to their responsibilities. In addition, platforms will need to submit periodic Transparency Reports (with frequency depending on their size) containing specific data such as the number of complaints received, orders received from Member State authorities, and other metrics.
  • Enhanced Safety Measures for Users: The DSA includes provisions to protect users from harmful content, illegal goods, and services. It requires online platforms to have systems and policies in place to prevent, detect, and remove such content, as well as mechanisms to report and appeal content removal decisions.
  • Enforcement and Penalties: The DSA establishes a system of sanctions and penalties for non-compliance by online platforms. Penalties may include fines of up to six percent of the platform's annual global revenue.
  • Cooperation and Coordination among EU Member States: The DSA aims to foster better cooperation among EU Member States in enforcing the regulations and handling cross-border issues related to online platforms. To that end, each Member State is required to appoint a Digital Services Coordinator by February 17, 2024. These coordinators serve as enforcers of the DSA within their respective countries.

Meet the Author

Jessica Dees

Jessica is the Director of Trust & Safety Policy and Operations at TrustLab.
