EU Data and Digital Drive: 10 Things to Know About the Digital Services Act

February 17, 2023

A key pillar of the EU’s overhaul of the digital economy, the Digital Services Act (“DSA”) aims to harmonise the rules for online intermediaries. It imposes numerous new obligations on in-scope businesses, which scale up according to the size of the business and the risk of the activities it undertakes. Businesses should review the extent to which the DSA applies to their operations and begin to consider the changes that may be needed.

This is the second OnPoint in Dechert’s EU Data and Digital Drive series, covering the Digital Services Act. To read our previous OnPoint providing an overview of forthcoming legislation, see here.

1. Scope

Like the General Data Protection Regulation (“GDPR”), the DSA has extra-territorial effect: businesses established outside the EU will also be subject to the DSA to the extent that they offer intermediary services to recipients established or located in the EU. Its scope of application includes online marketplaces, social media platforms, app stores, cloud providers, and search engines. The legislation differentiates between “mere conduit”, caching and hosting services, with the focus firmly on online platforms (such as social media platforms and marketplaces).

There is no general de minimis exclusion, but micro and small enterprises are exempt from certain sections of the DSA (including the transparency reporting obligations and all obligations on online platforms), provided they have not been designated as very large online platforms. Very large online platforms and very large online search engines (“VLOPs” and “VLOSEs”) are those with at least 45 million average monthly active users in the EU that have been designated as such by the European Commission; they will be subject to additional obligations.

Despite the name, the DSA is a regulation – meaning it applies directly across all 27 EU Member States and supersedes any overlapping national laws. Moreover, Member States cannot go beyond the requirements in the DSA as this is a full harmonisation instrument. Businesses can therefore roll out their compliance programmes across the whole EU.

2. Timing

The DSA applies to businesses from February 17, 2024, although VLOPs and VLOSEs will be subject to its rules four months after being notified of their designation, if earlier. In addition, there is an earlier deadline of February 17, 2023 for online platforms to publish information on their average monthly active recipients in the EU. The European Commission is also inviting online platforms to notify these published numbers to it; the Commission will then make its VLOP and VLOSE designation assessments based on those numbers.

3. Enforcement

The DSA adopts a unique oversight structure: the European Commission will be the primary regulator for VLOPs and VLOSEs, with enforcement powers similar to those it has in antitrust proceedings, while other providers will be supervised by national Member State authorities known as Digital Services Coordinators. As such, although the law itself is fully harmonised, enforcement may still vary between Member States.

Continuing the GDPR’s theme of eye-watering penalties, fines under the DSA can reach 6% of annual worldwide turnover. In addition, VLOPs and VLOSEs could face daily penalty payments of up to 5% of their average daily income or worldwide annual turnover. Among other enforcement powers, in serious cases regulators can ask a national court to temporarily suspend the provider’s service.
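
By way of illustration only, the short Python sketch below converts these headline maximums into concrete figures for a hypothetical provider. The €10 billion worldwide annual turnover and the 365-day averaging period are assumptions made for the example, not figures taken from the DSA.

    # Illustrative only: maximum DSA penalties for a hypothetical provider.
    # The EUR 10bn turnover and 365-day year are assumptions, not DSA figures.
    annual_turnover_eur = 10_000_000_000

    max_fine = 0.06 * annual_turnover_eur            # up to 6% of worldwide annual turnover
    avg_daily_turnover = annual_turnover_eur / 365   # simple daily average
    max_daily_penalty = 0.05 * avg_daily_turnover    # up to 5% per day for VLOPs/VLOSEs

    print(f"Maximum fine:          EUR {max_fine:,.0f}")           # EUR 600,000,000
    print(f"Maximum daily penalty: EUR {max_daily_penalty:,.0f}")  # EUR 1,369,863

Because daily penalty payments accrue for each day of continued non-compliance, they can mount quickly on top of any one-off fine.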

4. Obligations scale up and are cumulative

The DSA includes a set of ‘base’ obligations which apply to all intermediary service providers. On top of this, there are further obligations on those providing hosting services. There are then additional obligations on online platforms, and even further obligations on VLOPs and VLOSEs. The European Commission website sets out these obligations in a table format that businesses may find useful.

5. Transparency

Transparency is a key theme of the DSA: all intermediary service providers must publish reports on their content moderation at least once a year (every six months for VLOPs and VLOSEs), including information on their own-initiative moderation and the number of complaints received. There are additional information requirements for online platforms and for VLOPs and VLOSEs. Online platforms must also continue to publish information on their average monthly active recipients in the EU at least once every six months.

6. Accountability

The DSA contains numerous accountability measures, including the designation of a point of contact whose contact details are to be made publicly available and easily accessible. Intermediary service providers not established in the EU are also required to appoint a representative in an EU Member State where they offer their services. Note that the DSA explicitly states that this representative can be held liable for the provider’s non-compliance, so representatives may be harder to secure than under the GDPR, or may at least demand more stringent indemnity or insurance arrangements.

7. Documentation and Processes

The DSA imposes numerous requirements for providers’ terms and conditions and other documentation and processes, including the following.

All intermediary service providers:

  • to update their terms and conditions to provide information on any restrictions they impose on content, as well as on their moderation and complaints processes;

Online platforms:

  • to provide a complaints-handling system and to submit to out-of-court dispute settlement procedures (although the outcomes of these are not binding);
  • to update their terms and conditions to explain the circumstances in which they may issue a suspension;
  • where they use recommender systems (i.e. systems which determine the order in which information is presented on a platform), to explain the main parameters used and present any options for users to modify or otherwise influence those parameters;

VLOPs and VLOSEs:

  • to publish their terms and conditions in the official languages of all EU Member States in which they offer services, and to provide recipients with a summary of those terms and conditions;
  • to comply with any decision of the European Commission with respect to crisis response (for example, combating misinformation in the context of the war in Ukraine);
  • to create and maintain a searchable repository of adverts, including information about the content of each ad, when it was presented, and any targeting used;
  • to undertake annual systemic risk assessments relating to illegal content, various fundamental rights and intentional manipulation of the service by users (in areas such as public health and electoral processes), and to implement risk mitigation measures and independent audits;
  • to pay a supervisory fee to the European Commission to help fund enforcement of the DSA (the methodology by which this fee will be calculated is still in draft form).

8. Content Moderation

The liability exemptions for intermediary service providers originally set out in the e-Commerce Directive 2000 remain largely unchanged (i.e. providers are exempt from liability for user-generated information or content hosted on or transmitted via their platforms, provided certain conditions are satisfied). There is still no general obligation to monitor for or actively seek out illegal content (and providers do not lose the benefit of the exemptions if they choose to carry out their own investigations). Also retained is the “notice and takedown” system for hosting providers, under which they become liable only if they fail to remove or disable access to illegal content expeditiously upon obtaining actual knowledge of it.

However, the DSA does impose additional content moderation obligations on intermediary service providers, including the following.

Hosting service providers:

  • to provide mechanisms for parties to submit notices of illegal content;
  • to provide a statement of reasons to any recipient affected by a restriction imposed because they provided illegal content or content that is incompatible with the provider’s terms and conditions;
  • to notify the relevant authorities of any suspicion of a criminal offence involving a threat to life or safety;

Online platforms:

  • to prioritise notices submitted by so-called “trusted flaggers” (who are to be awarded that status by the Digital Services Coordinator of the Member State where they are established);
  • to suspend (i) recipients that frequently provide manifestly illegal content, and (ii) complainants/notice providers who frequently submit notices/complaints that are manifestly unfounded, in each case after having given them a prior warning (although there is no indication of what the time period should be between a warning and issuing a suspension).

9. Design

In keeping with the GDPR theme of requiring privacy by design, the DSA requires intermediary service providers to take certain steps in relation to the design of their offerings, including the following:

Online platforms:

  • must not use dark patterns (for example, “nudging” a user towards a particular choice through colour and/or size emphasis) in the design, organisation and operation of their online interfaces;
  • must not conduct targeted advertising based on profiling that uses the data of minors or special category personal data (such as religion or sexual orientation);
  • must provide clear labelling of, and information about, adverts, including the factors determining why an advert is shown to a particular recipient;
  • must put in place appropriate and proportionate measures to protect minors where their platform is accessible to minors.

VLOPs and VLOSEs:

  • are required to provide at least one option for recommender systems which is not based on profiling.

The European Commission has stated that the DSA complements the GDPR by introducing new explicit restrictions on targeted advertising and dark patterns, and has stressed that the DSA does not modify existing data protection rules such as the GDPR or the ePrivacy Directive.

10. Online Marketplaces

Online marketplaces must obtain certain information about traders, some of which must be made publicly available. They must design their online interfaces to allow traders to comply with their own obligations regarding the information that must be provided in relation to their products or services, and must make best efforts to assess whether traders have provided that information before allowing them to offer on the platform. Marketplaces must also make reasonable efforts to conduct random checks on whether products or services offered have been identified as illegal and, if they become aware that an illegal product or service has been offered, inform consumers who purchased it within the previous six months of that illegality.
