EU Digital Services Act: Ensuring Online Safety and Fairness
The Digital Services Act (DSA) package, which includes the Digital Services Act and the Digital Markets Act, aims to create a safer digital space where the fundamental rights of users are protected and a level playing field is established for businesses. The European Commission publishes a full introduction to the DSA package on its website.
The DSA regulates online intermediaries and platforms such as marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms. Its main goals include preventing illegal and harmful activities online and combating the spread of disinformation. The DSA ensures user safety, protects fundamental rights, and creates a fair and open online platform environment.
Why did the EU create this legislation?
The DSA is designed to protect consumer rights online by setting clear rules for the behavior of online services and platforms.
Under the new DSA rules, online platforms must implement ways to prevent and remove posts containing illegal goods, services, or content, while simultaneously giving users the means to report this type of content. End users get a safer online experience, and the companies operating these services have a more clearly defined set of rules to follow.
What types of company does the DSA cover or affect?
In April 2023, the EU identified an initial set of 19 platforms and search engines that fall into the category requiring immediate compliance: 17 Very Large Online Platforms (VLOPs) and two Very Large Online Search Engines (VLOSEs). The VLOPs are:
- Alibaba AliExpress
- Amazon Store
- Apple App Store
- Booking.com
- Facebook
- Google Play
- Google Maps
- Google Shopping
- Instagram
- LinkedIn
- Pinterest
- Snapchat
- TikTok
- Wikipedia
- X (formerly Twitter)
- YouTube
- Zalando
The two VLOSEs are Bing and Google Search.
For companies not designated as VLOPs or VLOSEs, compliance requirements began in February 2024. The EU requires each designated platform to update its user numbers at least every six months. If a platform's average monthly active users in the EU stay below 45 million for a full year, it will be removed from the list of very large services that must comply with the stricter DSA rules.
What are the basic requirements?
The basic requirements focus on greater transparency around how a platform functions, protection from harmful content, and the ability for end users to immediately identify and report illegal or objectionable content.
Meta has published details of how its recommendation algorithms work, so the process of recommending posts to users is openly documented for both Facebook and Instagram. European users can also view posts chronologically, rather than relying on the algorithm to recommend content.
Google has pointed to many of its existing processes, such as video reporting on YouTube, to demonstrate that it was largely compliant even before the DSA rules were applied. It has also announced more transparency around advertising, so end users can learn why they were targeted with specific ads.
TikTok is also addressing concerns around its recommendation algorithm by allowing users to bypass it and see posts chronologically instead. European users aged 13-17 are also no longer targeted with personalized advertising inside the app. Snapchat has similarly announced that users in Europe will be able to opt out of all personalized feed updates and recommendations.
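To make the opt-out concrete, here is a minimal sketch in Python of how a platform might offer a reverse-chronological fallback alongside an algorithmic feed. The `Post` type, the `relevance` score, and the `build_feed` function are all hypothetical illustrations, not any platform's actual implementation.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Post:
    author: str
    text: str
    created_at: datetime
    relevance: float  # score from a hypothetical recommender

def build_feed(posts: list[Post], use_recommendations: bool) -> list[Post]:
    """Return the user's feed.

    When the user opts out of personalization (as the DSA requires
    very large platforms to allow), fall back to a plain
    reverse-chronological sort instead of ranking by relevance.
    """
    if use_recommendations:
        return sorted(posts, key=lambda p: p.relevance, reverse=True)
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

# Example: a user who has opted out sees the newest posts first.
posts = [
    Post("alice", "older but highly relevant",
         datetime(2024, 1, 1, tzinfo=timezone.utc), 0.9),
    Post("bob", "newest post",
         datetime(2024, 2, 1, tzinfo=timezone.utc), 0.2),
]
print([p.text for p in build_feed(posts, use_recommendations=False)])
# ['newest post', 'older but highly relevant']
```

The design point is that opting out swaps the sort key entirely, so the recommender is bypassed rather than merely reweighted.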
What are the penalties?
The full details of supervision and penalties are still taking shape. The European Commission shares this responsibility with member states, each of which must designate a Digital Services Coordinator, ensuring that there is a dedicated authority in each state responsible for DSA compliance.
While the DSA is evolving, failure to comply can already be costly. The European Commission guidelines state: “The Commission has the same supervisory powers as it has under current anti-trust rules, including investigatory powers and the ability to impose fines of up to 6% of global revenue.”
Six percent of global revenue, not just European revenue, could be a significant amount for these very large companies, so preparation and compliance are essential. To illustrate with a hypothetical figure: a platform with €50 billion in global annual revenue would face a maximum fine of €3 billion.
Some companies are already pushing back. Europe’s largest online fashion retailer, Zalando, is suing the European Commission over its inclusion in the first wave of DSA compliance. Amazon has also asked the Commission to think more carefully about the classification of which companies need to comply.
What can you do to be prepared?
Compliance is now essential for the very large companies, but smaller companies that are still planning how to get ready for the DSA need to consider these areas of their business:
- Content moderation: There are no specific timeframes around content removal, but no company will want to ask a court to define a ‘reasonable’ amount of time to remove an offensive video from its site. Content moderation needs to be taken far more seriously; any site that allows users to contribute content has a responsibility to protect all its users.
- Proactivity: Companies are legally protected from liability for user-generated content and user behavior unless they become aware of specific illegal activity and fail to act. As such, a far more proactive approach to user safety is encouraged; this kind of always-on diligence is a natural candidate for AI-assisted monitoring systems.
- Notice and action: Users must have the option to notify the service of dangerous or offensive content, and action must be triggered whenever a report is made; a minimal sketch of this flow follows this list.
- Transparency: Social networks must be open about why they recommend certain posts to users, and marketplaces need to be open about how they vet the traders and transactions they accept. Transparency needs to become a cultural value and a normal expectation for all users of these services.
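As a rough illustration of the notice-and-action requirement, the following Python sketch shows one way a report could be recorded so that an action is always triggered, even if that action is simply routing the notice to a human moderator, with every decision leaving an auditable status. All class and function names here are hypothetical; a production system would add authentication, persistence, appeals handling, and the statements of reasons the DSA expects.

```python
import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class ReportStatus(Enum):
    RECEIVED = "received"
    IN_REVIEW = "in_review"
    CONTENT_REMOVED = "content_removed"
    NO_VIOLATION = "no_violation"

@dataclass
class Report:
    content_id: str
    reporter_id: str
    reason: str
    report_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    status: ReportStatus = ReportStatus.RECEIVED

class NoticeAndAction:
    """Hypothetical sketch of a notice-and-action pipeline."""

    def __init__(self):
        self.reports: dict[str, Report] = {}
        self.review_queue: list[str] = []

    def submit_report(self, content_id: str, reporter_id: str, reason: str) -> Report:
        """Record the user's notice and always trigger a follow-up action."""
        report = Report(content_id, reporter_id, reason)
        self.reports[report.report_id] = report
        # Action is triggered immediately: queue the report for review.
        report.status = ReportStatus.IN_REVIEW
        self.review_queue.append(report.report_id)
        return report

    def resolve(self, report_id: str, violation_found: bool) -> Report:
        """A moderator records the outcome, leaving an audit trail."""
        report = self.reports[report_id]
        report.status = (ReportStatus.CONTENT_REMOVED if violation_found
                         else ReportStatus.NO_VIOLATION)
        self.review_queue.remove(report_id)
        return report

# Usage: a report is filed and resolved, and its status is traceable.
pipeline = NoticeAndAction()
r = pipeline.submit_report("post-123", "user-42", "illegal goods")
print(pipeline.resolve(r.report_id, violation_found=True).status)
# ReportStatus.CONTENT_REMOVED
```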
Very large companies already need their compliance plans to be in place, and smaller organizations need to start exploring these changes now, with trusted advisory experts that have experience with the compliance framework and the required business processes.
What happens after February 2024?
The obligation for greater transparency came into effect in February 2023. In the summer of 2023, other DSA provisions came into effect for specific parties, including VLOPs and VLOSEs.
For these parties, the compliance measures laid out under Art. 11-43 of the DSA were brought into force, including:
- Risk mitigation measures
- Mandatory risk assessment
- Due diligence requirements
As the act comes into full effect, digital service providers must move from an implementation phase into one that complies with all aspects of the DSA package. Requirements now include providing clear and transparent terms and conditions for users, implementing appropriate content moderation policies, and complying with any additional reporting or compliance requirements established by the DSA.
The DSA rules came fully into effect on February 17, 2024. From that date forward, all digital service providers are expected to comply with the new regulations.
What can you do?
If you are not familiar with many of these legislative frameworks, don’t worry: Alorica already has extensive experience working with social media and platform companies on their Trust & Safety strategies, including content moderation.
We have followed the introduction of these rules over the past few years, and our procedures are compliant with all the published safety legislation globally. If you want to protect your company and your customers, and you need rapid advice on how to avoid non-compliance, contact our team for more information.