By Benji Loney

How to Be Compliant with The Digital Services Act: 5 Critical Steps

Arriving alongside a slew of other online safety bills, the Digital Services Act (DSA) is a landmark piece of legislation in the European Union that holds digital service providers more accountable for the content that appears on their platforms.


The big players (Facebook, X, Pinterest, etc.) have been dealing with the Digital Services Act for a few months already, so we've got a pretty good handle on what it involves, the responsibilities it brings, and what it takes to be compliant.


If you’re an online platform, you will have to comply with the DSA by February 17th, 2024.


But what does it take to be compliant?


Today we’ll walk you through the five critical steps for compliance with the Digital Services Act. These steps are:


  1. Your user’s ability to flag/report content

  2. User communications once content is reported

  3. How to take action on reported content

  4. User communications once action is taken on the content

  5. Data and transparency


Before we dive in, we’ll answer two questions:


  1. How do I know if I’m subject to the Digital Services Act?

  2. What happens if you don’t comply with the DSA?


Let’s get started.



How do I know if I’m subject to the Digital Services Act?


We covered this topic at length in a previous blog post, but in short: if you’re an intermediary service, you have to comply with the Digital Services Act.


However, there are several categories of online services (see link above), and as you move up the ladder the regulatory impact of the DSA increases.


It is critical to understand which tier you fall into: in particular, whether you’re a hosting service, and whether you’re considered an online platform.


A major factor to check is: do you have any product features that allow users to upload and/or share their own content? Some examples: profile pics, bios, comments, posts, goods, pictures, templates, files, etc.


If you’re still scratching your head, we’ve put together a DSA audit team to help answer all your questions and help you understand what aspects of the Digital Services Act apply to you.



If you’re sure that you need to comply, keep reading.



What happens if you don’t comply?


Much like the GDPR and other hefty regulations that have hit the tech industry lately, the Digital Services Act can have a big impact on your revenue.


Here's how:


The DSA gives European regulators the power to impose fines based on your global revenue, and these fines can go as high as 6% of your global annual revenue. To put that in perspective: a platform with €100 million in global annual revenue could face fines of up to €6 million.


But, for the first time, it also opens the door for any internet user to take action if they stumble upon illegal content on your platform. They can raise complaints through official third-party dispute settlement bodies, outside of the court system, and seek compensation from you.


In a nutshell 👉 you've got the regulators keeping an eye on your compliance AND a growing ecosystem where users can file complaints. This could lead to some heavy financial penalties. It's definitely something to take seriously.



So, if you’re a hosting service or online platform, let’s go through the five critical steps for DSA compliance.


These are the high-priority steps for Digital Services Act compliance. If you want the full scoop (and you definitely should after reading this), you can grab the complete checklist for DSA compliance here.




The Five Critical Steps for Digital Services Act Compliance


1) Your User’s Ability to Flag/Report Content


The way that content gets reported on your platform needs to be updated to DSA standards.

There are three key things you need to do:


1. You must have a specific reporting option for illegal content under the DSA


2. Users must be able to report content even if they are not logged in

3. You must have a DSA-compliant reporting form
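To make this concrete, here’s a rough sketch of the fields a DSA-style report form might capture. The field names are our own illustration (not an official schema), but they track the notice requirements in the DSA’s notice and action article:

```typescript
// Illustrative sketch only: field names are hypothetical, not an official schema.
// They track the notice requirements of the DSA's notice and action mechanisms.

interface IllegalContentReport {
  // A sufficiently substantiated explanation of why the content is illegal
  explanation: string;
  // The exact electronic location of the content (e.g., a URL)
  contentUrl: string;
  // Name and email of the reporter. Reports must also be accepted from
  // people who are not logged in, and the DSA carves out exceptions where
  // reporters may stay anonymous.
  reporterName?: string;
  reporterEmail?: string;
  // A statement that the report is made in good faith and is accurate
  goodFaithConfirmation: boolean;
  // When the report was submitted (useful for step 5, data and transparency)
  submittedAt: string; // ISO 8601 timestamp
}
```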



> Notice how Pinterest lets you report content without being logged in, and includes a special category of complaint.



> When selecting the option to “report Pin for EU local law violation,” a user (or non-user) is taken to a specific form for EU illegal content.


Get to know our regulatory compliance software and learn how to be compliant with the Digital Services Act.



2) User Communications Once Content is Reported


This is the second step for DSA compliance, and the first step in your mandatory Notice & Action Flow.


The Notice & Action flow covers the communication and content moderation actions taken from the moment a user reports content until you decide whether or not to remove it.


Here’s a bird’s-eye view of this flow:
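  1. A user (or non-user) reports content on your platform

  2. You confirm receipt to the flagger (and log the report internally)

  3. You moderate the content and decide whether to take action

  4. You notify the flagger, and the content creator if action was taken

  5. You offer an appeals flow and point to the other mandated redress mechanisms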

This means that in step two, “user comms once content is reported,” you have to make sure that both the platform (you) and the flagger/reporter get an email notification as soon as content gets reported.


An email to a flagger might, for example, confirm receipt, reference the reported content, and explain what happens next.


Easy – but with the number of reports you will be getting, you should probably set up an automatic flow for this (we can help).
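As a minimal sketch of that automation – assuming a hypothetical `sendEmail` helper standing in for whatever transactional email provider you actually use:

```typescript
// Minimal sketch of an automated acknowledgment flow. `sendEmail` is a
// hypothetical stand-in for your real email provider's API.

async function sendEmail(to: string, subject: string, body: string): Promise<void> {
  console.log(`To ${to}: ${subject}\n${body}`); // swap for a real API call
}

interface NewReport {
  reportId: string;
  reporterEmail: string;
  contentUrl: string;
}

async function acknowledgeReport(report: NewReport): Promise<void> {
  const body =
    `We received your report (ref: ${report.reportId}) about content at ` +
    `${report.contentUrl}. We'll review it and let you know the outcome, ` +
    `including how you can appeal our decision.`;

  // 1) Confirm receipt to the flagger...
  await sendEmail(report.reporterEmail, "We received your report", body);
  // 2) ...and alert your own moderation queue.
  await sendEmail("moderation@your-platform.example", `New report ${report.reportId}`, body);
}
```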


Now, let’s look into the next steps in your Notice & Action Flow: moderating the content.



3) How to Take Action on Reported Content


Once you receive a user report, you must moderate the content to decide whether it’s illegal and should be removed, and whether action needs to be taken on the user who published it (for example, suspending their account).


You can use a mix of AI and human review during this part of the process – which we highly advise, since you might be getting hundreds, if not thousands, of user reports to process per month.


The other key thing is that you need to make decisions quickly (we advise deciding within 72 hours of the content being reported). This can be tricky given the nuances of local law and languages.
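One way to keep yourself honest on timing is to attach a review deadline to every report as it comes in, and to route only high-confidence cases to automation. A sketch (the 72-hour window is our recommendation above, not a hard number from the DSA; `classifierScore` stands in for whatever AI model you use):

```typescript
// Sketch of a triage step: route each report and track a review deadline.

const REVIEW_WINDOW_HOURS = 72; // our recommended SLA, not a DSA-mandated figure

type Route = "auto_remove" | "human_review";

interface QueuedReport {
  reportId: string;
  route: Route;
  reviewDeadline: Date;
}

// `classifierScore` is a hypothetical 0..1 confidence from your AI model
// that the reported content is illegal.
function triageReport(reportId: string, classifierScore: number): QueuedReport {
  const reviewDeadline = new Date(Date.now() + REVIEW_WINDOW_HOURS * 60 * 60 * 1000);
  // Only very high-confidence hits get automated action; everything else goes
  // to a human moderator who can weigh local-law and language nuances.
  // Remember: if you use automated means, the DSA expects you to disclose it.
  const route: Route = classifierScore > 0.95 ? "auto_remove" : "human_review";
  return { reportId, route, reviewDeadline };
}
```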


The notice and action mechanisms article (Article 14.6 in the draft, Article 16(6) in the final text) requires you to process notices and take decisions “in a timely, diligent, non-arbitrary and objective manner” – and, if you use automated means for that processing or decision-making, to say so in your notification to the flagger.



Once you’ve decided what to do about the report, it’s time to kick off another round of communications with the users.




4) User Communications Once Action is Taken on the Content


The last part of your Notice and Action flow is user comms once you’ve moderated the content.


Two things can happen here:


If the content is removed (or any other action is taken):


You need to send off another set of user comms to the flagger and the content creator with compliant language.


If the content was not removed:


The content creator doesn’t need to be notified, but the flagger does: they must be told that the content was reviewed and why it wasn’t removed.


Either way, you must provide an appeals flow, and upon completion of that flow, provide details on how to access the other mandated redress mechanisms.
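Pulling those branches together, the outcome comms might be sketched like this (again with a hypothetical `sendEmail` helper; the exact wording of your appeal language should come from your legal team):

```typescript
// Sketch of outcome notifications after a moderation decision.

interface ModerationDecision {
  removed: boolean;
  reason: string; // your statement of reasons for the decision
  flaggerEmail: string;
  creatorEmail: string;
}

// Hypothetical appeal language -- run your real wording past legal counsel.
const APPEAL_NOTE =
  "You can appeal this decision through our internal complaint-handling " +
  "process. Once that process is complete, we will point you to the other " +
  "redress mechanisms available to you.";

async function sendEmail(to: string, subject: string, body: string): Promise<void> {
  console.log(`To ${to}: ${subject}\n${body}`); // stand-in for a real provider
}

async function notifyOutcome(d: ModerationDecision): Promise<void> {
  if (d.removed) {
    // Action taken: both the flagger and the content creator are notified.
    await sendEmail(d.flaggerEmail, "Your report led to action", `${d.reason}\n\n${APPEAL_NOTE}`);
    await sendEmail(d.creatorEmail, "Your content was removed", `${d.reason}\n\n${APPEAL_NOTE}`);
  } else {
    // No action: only the flagger needs to hear the outcome and the reasoning.
    await sendEmail(d.flaggerEmail, "Your report was reviewed", `${d.reason}\n\n${APPEAL_NOTE}`);
  }
}
```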


This Notice & Action flow is an ongoing process that ideally requires a careful mix of human and machine review – there are a few ways you can approach it (and the DSA as a whole), and we’re happy to help you figure out the best process for you. Let’s chat?


The final step for DSA compliance is all about providing data on your reports.




5) Data and Transparency


The final critical step in ensuring compliance with the Digital Services Act is centered around two key concepts: data collection and transparency.


This step is crucial in demonstrating your platform's commitment to adhering to the DSA's guidelines and maintaining public trust.


1) Collect the Right Data


Keep track of things like:


  • Timestamps: When did users report content?

  • Content Categories: What kind of stuff are people flagging?

  • Location, Location, Location: Where are these reports coming from?

  • Action Stats: How much of this flagged content did you actually do something about?

  • (Full list here)


This isn't just busywork. This data is gold for keeping your compliance strategy in check and ensuring user safety.


It can be tricky to manage the storage of this data across multiple content types (profiles, posts, comments, etc.), report types (proactive, reactive ToS, Notice & Action complaints, etc.), and databases, but it’s critical to do so for the next step.
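As a sketch, the kind of record you might log per report could look like this (field names are illustrative, not an official schema):

```typescript
// Illustrative per-report record covering the data points listed above.
// Field names are our own, not an official DSA schema.

interface ModerationEvent {
  reportId: string;
  reportedAt: string; // timestamp: when the user reported the content
  contentType: "profile" | "post" | "comment" | "image" | "other";
  reportCategory: string; // e.g., "hate_speech" or "counterfeit_goods"
  reporterCountry?: string; // where the report came from, if known
  actionTaken: "removed" | "restricted" | "account_suspended" | "none";
  decidedAt?: string; // timestamp: when you acted (or decided not to)
  automatedDecision: boolean; // whether automated means were used
}
```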


2) Transparency Reports


You've got to whip up Transparency Reports that spell out:


  • What You Found: Share the juicy details from your data collection.

  • What You Did: How did you handle these reports?

  • EU Commission Love: These reports need to be up to EU Commission standards. They're not just for show; they've got to be accurate and clear.


You can check out some of the Transparency Reports that have been published by Facebook, Amazon and Pinterest. This should give you a good starting idea of what you need to include.
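When it’s time to write the report, those per-event records roll up into summary numbers. A minimal aggregation sketch, reusing the hypothetical `ModerationEvent` shape from the previous step:

```typescript
// Roll per-report records up into the summary counts a transparency report
// needs: reports received and actions taken, broken down by category.

function summarize(
  events: ModerationEvent[]
): Record<string, { reports: number; actioned: number }> {
  const summary: Record<string, { reports: number; actioned: number }> = {};
  for (const e of events) {
    if (!summary[e.reportCategory]) {
      summary[e.reportCategory] = { reports: 0, actioned: 0 };
    }
    summary[e.reportCategory].reports += 1;
    if (e.actionTaken !== "none") summary[e.reportCategory].actioned += 1;
  }
  return summary;
}
```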



Where you go from here


The Digital Services Act (DSA) is not just another regulatory hurdle to clear; it's a vital part of ensuring a safe and responsible digital environment in the EU.


By following the five critical steps we've outlined - from enhancing user reporting mechanisms to ensuring transparent data reporting - your platform can not only comply with the DSA but also foster a more trustworthy and secure online space for users.


Remember, the deadline for DSA compliance is February 17th, 2024, so the clock is ticking.


Don't let the complexities of compliance overwhelm you. Our comprehensive DSA Compliance Checklist guides you through every step of this journey.


There are also great third-party tools and products to make compliance easy, including TrustLab’s Compliance Suite, which handles the complexity for you at an unbeatable value.


And if you're looking for a more in-depth understanding and personalized guidance, our DSA Audit is just a click away.


