EU - The DSA: A new era for online harms and intermediary liability

The EU’s proposed Digital Services Act was adopted by the European Parliament on 5 July 2022. While it must still be formally adopted by the Council (expected in September 2022) and then published in the Official Journal, its contents are now largely finalised.

The Act marks the biggest shake-up to the rules on online intermediary liability in 20 years and sets out the EU’s stance in the global online harms debate. This involves the fraught attempt to reconcile the damage caused by unregulated user-generated content with fundamental rights to freedom of information and the practical limitations of moderating content at scale.

The new and wide-ranging obligations, particularly the obligations on very large platforms to assess and mitigate risks on their platform, are likely to shape the global approach to content regulation in this emerging area of law.

A determination to regulate digital spaces

The Digital Services Act is just one part of a wider programme to reform digital regulation, notable not just for its ambition but also for the speed of change. The Act itself has moved at breakneck speed: political agreement was reached only 18 months after it was proposed in December 2020, and it is likely to start applying to large social media platforms and search engines in 2023, with the remaining provisions applying in 2024.

Although described as an “Act”, it is in fact an EU Regulation and so will apply directly in every Member State. It imposes tiered obligations: the more complex and larger the service, the greater the obligations. Those obligations are described below and summarised in the table here.

Intermediaries still protected but subject to important new obligations

The Digital Services Act preserves the “hosting”, “caching” and “mere conduit” defences in the eCommerce Directive.

This includes prohibiting any general monitoring obligation from being imposed on these intermediary service providers and preserving the existing “notice and takedown” process - whereby a hosting provider will only become liable for illegal content if, on obtaining actual knowledge of the illegality, they fail to remove or disable access to the content expeditiously.

These intermediary service providers will also benefit from a new “Good Samaritan” provision under which they will not lose these defences simply because they undertake own-initiative investigations to identify and remove illegal content.

The preservation of the hosting defence and other intermediary protections is extremely welcome, but intermediaries will now be subject to significant new obligations under the Digital Services Act. All service providers (including those only providing mere conduit and caching services) must comply with the following requirements:

  • Single point of contact: Reflecting the fact that some service providers can be difficult to identify and contact, they must provide a public “point of contact” so they can be contacted by authorities and users.
  • EU representative with direct liability: If the service provider is based outside the EU, it must also appoint a legal representative in the EU. This sounds similar to the EU representative concept in the GDPR. However, under the Digital Services Act, that representative can be held directly liable for breaches. Given the potentially punitive sanctions (below), this is not a role to be undertaken lightly.
  • T&Cs and disclosure obligations: The service provider must set out in the T&Cs any restrictions on the service, alongside details such as content moderation measures and algorithmic decision making.
  • Annual transparency reports: The service provider must issue an annual transparency report on matters such as content moderation measures and the number of take down and disclosure orders received.
  • Take down and disclosure orders: Service providers that receive take down or disclosure orders from judicial or administrative authorities must notify the authority of any action taken (though the Act itself does not impose an obligation to comply with the order). If the service provider takes down the content or discloses information, it must notify the user.

In addition, hosting providers are subject to further obligations to:

  • Receive take down requests: Anyone should be able to notify the hosting provider of illegal content. The hosting provider must process that notice diligently and report back on whether the content was removed.
  • Notify users of takedowns: Hosting providers must notify users if they remove content. This also includes demoting or restricting the visibility of the content (e.g. “shadow bans”) and the notification should include details of whether the decision was taken using automatic means (e.g. based on machine learning classifications).
  • Report illegal content: Hosting providers must inform the judicial authorities if the hosted content creates a suspicion that a criminal offence has occurred – but this is limited to offences involving a threat to life or safety.

Significant changes for online platforms – Online harms v freedom of information

The more important changes in the Digital Services Act apply to providers of “online platforms”. These are hosting providers who, on behalf of a user, store and disseminate information to the public, unless this is a minor ancillary feature. This includes social media services and online marketplaces (but is unlikely to include private messaging services).

Any attempt to regulate user-provided content is fraught with difficulty and raises hard questions about the balance between fundamental rights to freedom of information, the impact of online harms and the practical limitations of moderating content at scale.

The Digital Services Act appears to take a relatively non-interventionist stance. Save for the largest platforms (below), there are very limited obligations to police content on the platform. Instead, the new regime has more of a bias towards protecting content, giving users a right to complain against the removal of content and even to use an out-of-court appeals process if they are unhappy with the platform’s handling of that complaint. This will be a significant change for many platforms, which will have to be much more transparent about their moderation processes and may need significant additional resources to deal with the resulting objections and appeals from users.

Alongside these changes are other significant new obligations, including:

  • Dark patterns: Platform providers cannot use interfaces that manipulate or distort the choices taken by users.
  • Suspension of repeat offenders: Where a user continues, after being warned, to “frequently” provide illegal content, the platform provider must suspend them for a reasonable time.
  • Disclose EU MAU: The platform provider must disclose the number of monthly active users in the EU.
  • Ads and recommender system transparency: The platform provider must provide users with information about the adverts they are shown, including the reasons why a particular ad was selected for them. Similarly, the platform provider must be transparent about the operation of any recommender system – i.e. the system used to automatically select content for the user to view.
  • Trader verification: The platform provider must ensure that traders on the platform identify themselves and must make best efforts to verify that the information they provide is correct. The platform must also design its services to enable those traders to comply with their legal obligations.
  • Online protection of minors: Providers of online platforms accessible to minors must put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors.

While there are some exemptions for small and micro-enterprises, this will be a significant technical and operational lift for many organisations.

Additional obligations for the very largest platforms

The highest tier of regulation applies to:

  • VLOP: These are very large online platforms which have over 45 million monthly active users in the EU and are designated as such by the European Commission.
  • VLOSE: These are very large online search engines which have over 45 million monthly active users in the EU and are designated as such by the Commission.

This designation brings with it some of the very strongest obligations in the Digital Services Act. This includes obligations to conduct a risk assessment of their services and to take steps to mitigate any risks identified as part of that process.

To provide additional reassurance that appropriate measures have been taken, these entities must also create an independent compliance function and must commission an annual independent audit.

All of this is then backed up by stringent transparency requirements. This includes publishing the independent audit report referred to above alongside other information, such as details of the resources dedicated to content moderation. These very large providers must also provide information (including the design, logic, functioning and testing of their algorithmic systems) to regulators and vetted researchers to allow them to monitor and assess compliance with the Act.

Enforcement

Breach of the Digital Services Act attracts the now customary turnover-based sanctions: a fine of up to 6% of annual worldwide turnover. Users also have a right to compensation for any damage or loss suffered as a result of a breach.

The Act also allows for enforcement both by national regulators, known as Digital Services Coordinators, and directly by the European Commission. The European Commission will have competence in respect of VLOPs and VLOSEs, with specific rights to charge those entities a fee to cover its supervisory activities.

Get ready for 2024 (or 2023 for Big Tech)

For most entities, the obligations in the Digital Services Act will apply from the later of 15 months after the Act comes into force and 1 January 2024.

However, it may start to apply to VLOPs and VLOSEs from an earlier date, as the Act will apply to them four months after they are designated as such by the European Commission. Given the Commission is expected to start the designation process soon after adoption, the Act is expected to apply to these entities in 2023.

Wider changes and emerging global content regulation laws

The Digital Services Act forms part of a wider reformation of the regulation of digital technology. Alongside the very significant changes it will bring are radical changes to the competition landscape in the Digital Markets Act, reforms to the use of data in the Data Governance Act and new cyber obligations in the NIS2 Directive (all of which have recently reached political agreement or been adopted). Slightly further out are other major reforms, such as the Data Act, the AI Act and the Health Data Space Regulation (which are still working their way through the legislative process).

The emergence of content regulation laws is not limited to the EU. There has been an explosion of new laws across the globe, including India’s Digital Media Ethics Code, Singapore’s Protection from Online Falsehoods and Manipulation Act and the UK’s Online Safety Bill. Content regulation laws have even been proposed in the US, including Texas’s HB 20 law and Florida’s Stop Social Media Censorship Act. However, these laws have generally failed on the basis that they violate the social media companies’ strong rights to freedom of speech under the First Amendment.

While societal factors (such as views on privacy and freedom of speech) have a significant impact on online harms laws, the EU’s position as a “regulatory supertanker” means the Digital Services Act is likely to play a pivotal role in the development of other content regulation laws around the world.

What about the UK post-Brexit?

The Digital Services Act will not apply to the UK. However, the UK has chosen to implement similar obligations by way of its Online Safety Bill, which was introduced to Parliament in May 2022. This includes placing significant emphasis on conducting risk assessments against a range of different harms.

The Online Safety Bill appears to take a more interventionist approach than the EU Digital Services Act, including the ability of Ofcom (the UK regulator) to mandate the use of specific content moderation technology. The ‘bells and whistles’ attached to these laws are also very different. For example, the Online Safety Bill contains a range of collateral obligations, such as giving users the option to verify their identity and requiring pornographic websites to block access by children.

This means that, while there will be a lot of overlap between the work to prepare for the EU Digital Services Act and the UK’s proposed online safety regime, there will also be significant differences. This is consistent with the wider global picture as more and more states enact content regulation laws – whilst there are often common themes, the overall picture looks like it will be fragmented and inconsistent.

The text adopted by the EU Parliament is available here.