The Digital Services Act (DSA)[1] was approved by the Council of the European Union (EU) in September 2022[2] and published in the Official Journal on 27 October 2022.[3]
It marks the biggest shake-up to the rules for online intermediary liability in twenty years. The DSA is accompanied by flanking instruments regulating terrorist content, child sexual abuse material (CSAM) and political advertising which, together, will create an entirely new framework for the regulation of online harms in the EU.
These new and wide-ranging obligations attempt to reconcile the harm caused by unregulated user-generated content, fundamental rights to freedom of information, and the practical limitations of moderating content at scale. They will not only be hugely significant to citizens’ online experience within the EU but are also likely to shape the global approach to content regulation in this fast-emerging area of law.
The DSA is part of a package of measures proposed by the European Commission (Commission) to reform digital regulation. This package is notable not only for its ambition but also for its speed: political agreement on the DSA was reached only eighteen months after the Commission proposed it in December 2020, and it is likely to start applying to large social media platforms and search engines in 2023, with the remaining provisions applying in 2024.
Whilst described as an ‘Act’, the DSA is in fact an EU Regulation and so will apply directly in every Member State. It imposes tiered obligations: the more complex and larger the service, the greater the obligations. We summarise these obligations in the table here.
3.1 Liability of Online Platforms
The DSA will continue to apply the hosting,[4] caching,[5] and mere conduit[6] defences that first appeared in the Electronic Commerce Directive 2000.[7]
This includes prohibiting any general monitoring obligation from being imposed on these intermediary service providers and preserving the existing ‘notice and takedown’ process, whereby a hosting provider only becomes liable for illegal content if, on obtaining actual knowledge of the illegality, it fails to remove or disable access to that content expeditiously.[8]
Nonetheless, the DSA draws a clearer line between these intermediary liability protections and online platforms’ liability under consumer law. Online platforms, such as marketplaces, will remain liable under consumer law when they lead an ‘average consumer’ to believe that the information, product or service that is the object of the transaction is provided either by the platform itself or by a recipient of the service acting under its authority or control.[9] This will be the case, for example, where an online platform withholds the identity or contact details of a trader until after the conclusion of the contract between the trader and the consumer, or where an online platform markets the product or service in its own name rather than in the name of the trader who will supply it.[10]
When determining whether the online platform created such an appearance, the DSA refers to the concept of ‘average consumer’. This term was analysed by Advocate General Szpunar in the Louboutin case.[11] The Advocate General’s opinion suggests that the marketplace will be liable where a ‘reasonably well-informed and reasonably observant internet user’ perceives the offer of the trader as an integral part of the commercial offer of the marketplace.[12]
Intermediary service providers will also benefit from a new ‘Good Samaritan’ provision under which they will not lose these defences simply because they undertake own-initiative investigations to identify and remove illegal content.[13]
Similarly, where a provider automatically indexes information uploaded to its service, has a search function, or recommends information on the basis of users’ preferences, this will not of itself be sufficient grounds for considering that provider to have specific knowledge of illegal activities carried out on, or illegal content stored on, that platform.[14]
The preservation of the hosting defence and other intermediary protections is extremely welcome, but these protections now come with significant new obligations under the DSA.
3.2 Providers of Intermediary Services
All intermediary service providers (including those only providing mere conduit and caching services) must comply with the following requirements:
- establish a single point of contact for communications with Member State authorities, the Commission and recipients of the service;
- if established outside the EU, designate a legal representative within the Union, echoing the requirement under the General Data Protection Regulation[15] for certain non-EU controllers to appoint an EU representative;[16]
- set out in their terms and conditions any restrictions imposed on the use of the service, including their content moderation policies and tools; and
- publish annual transparency reports on their content moderation activities.
3.3 Providers of Hosting Services
Hosting services are a subset of intermediary services consisting of the storage of information provided by or at the request of a user, such as cloud service providers, online marketplaces, social media, and mobile application stores.
In addition to the general intermediary service provider obligations discussed in the previous section, hosting providers are subject to additional obligations to:
- put in place ‘notice and action’ mechanisms allowing anyone to notify them of illegal content;
- provide affected users with a statement of reasons where their content is removed or restricted; and
- notify law enforcement authorities of any suspicion of a criminal offence involving a threat to a person’s life or safety.
The most important changes in the DSA apply to providers of ‘online platforms’. These are hosting providers that, at the request of a user, store and disseminate information to the public, unless this is a minor and purely ancillary feature.[17] This includes social media services and online marketplaces (but is unlikely to include private messaging services).
Any attempt to regulate user-provided content is fraught with difficulty and raises hard questions about the balance between fundamental rights to freedom of information, the impact of online harms and the practical limitations of moderating content at scale.
The DSA appears to take a relatively non-interventionist stance. Save for the very largest platforms (discussed below), there are very limited obligations to police content on the platform. Instead, the new regime leans towards protecting content, giving users a right to complain against the removal of content and even to use an out-of-court appeals process if they are unhappy with the platform’s handling of that complaint. This is a significant change for many platforms, which will have to be much more transparent about their moderation processes and may need significant additional resources to deal with the resulting objections and appeals from users.
Alongside these changes are other significant new obligations, including:
- giving priority to notices of illegal content submitted by ‘trusted flaggers’;
- suspending users who frequently provide manifestly illegal content;
- new transparency requirements for advertising and recommender systems, including a ban on advertising targeted at minors or based on profiling using special category data;
- a prohibition on ‘dark patterns’, that is, interfaces that deceive or manipulate users, building on existing protections under the Unfair Commercial Practices Directive[18] and recent regulatory guidance;[19] and
- for online marketplaces, obligations to collect and verify information about traders before allowing them to use the platform.
While there are some exemptions for small and micro-enterprises, this will be a significant technical and operational lift for many organisations.
The highest tier of regulation applies to:
- very large online platforms (VLOP); and
- very large online search engines (VLOSE),
being, in each case, services with at least 45 million average monthly active users in the EU that have been designated as such by the Commission.
This designation brings with it some of the very strongest obligations in the DSA, reflecting the potential societal impact of such platforms. It includes obligations to conduct a risk assessment of their services and to take steps to mitigate any risks identified as part of that process.
To provide reassurance that appropriate measures have been taken, these entities must also create an independent compliance function and must commission an annual independent audit.
All of this is then backed up by stringent transparency requirements, including publishing the independent audit report referred to above alongside other information, such as details of the resources dedicated to content moderation. These very large providers must also provide information (including on the design, logic, functioning and testing of their algorithmic systems) to regulators and vetted researchers to allow them to monitor and assess compliance with the DSA.
Breach of the DSA attracts the now customary turnover-based fines: it can be punished by a fine of up to 6% of annual worldwide turnover, and users also have a right to compensation for any damage or loss suffered as a result of a breach.
Enforcement is not limited to fines. The DSA also allows for enforcement both by national regulators, known as Digital Services Coordinators, and directly by the Commission. Both will have the power to require immediate action where necessary to address very serious harms, and platforms may offer commitments on how they will remedy them.
The Commission will have competence in respect of VLOP and VLOSE, with a specific right to charge an annual supervisory fee of up to 0.05% of the provider’s worldwide annual net income to cover its supervisory activities.
For most entities, the obligations in the DSA will apply from the later of fifteen months after the DSA comes into force and 17 February 2024.
However, it might start to apply to VLOP and VLOSE from an earlier date as the DSA will apply to them four months after being designated as such by the Commission. Given the Commission is expected to start the designation process soon after adoption, the DSA is expected to apply to these entities in 2023.
While the DSA forms the cornerstone of the EU’s new content regulation framework, it is accompanied by a number of other new content regulation laws.
The DSA operates by putting in place a baseline ‘notice and takedown’ system. As set out above, hosting providers (including online platforms) must allow third parties to notify them of any illegal content they are hosting. Once notified, the hosting provider must remove that content expeditiously to continue to benefit from the hosting defence. In addition, online platform providers must provide an expedited removal process for notifications from trusted flaggers, suspend users who frequently post illegal content and provide additional protection to minors.
Alongside these protections, VLOP and VLOSE have specific obligations to assess and mitigate ‘systemic risks’ arising from their services. That assessment must include the risks of or to:
- the dissemination of illegal content;
- fundamental rights, including freedom of expression and information, privacy, non-discrimination and the rights of the child;
- civic discourse, electoral processes and public security; and
- gender-based violence, the protection of public health and minors, and individuals’ physical and mental well-being.
The DSA is flanked by a number of specific instruments that strengthen and particularise the protections against online harms. For example, illegal content is also regulated by the Regulation on terrorist content online (Terrorist Content Regulation)[20] and the proposed Regulation on child sexual abuse (CSAM Regulation).[21] These set out a number of specific obligations, such as the obligation to remove terrorist content within one hour[22] and the ability of authorities to issue detection orders requiring a provider to install and operate appropriate technology to detect known or new CSAM material.[23]
There are also specific controls on political advertising in the proposed Regulation on the transparency and targeting of political advertising (Political Advertising Regulation).[24] Amongst other things, this requires much greater transparency for political advertisements, which must be identified as such and contain additional information, such as the sponsor of the advertisement and the remuneration paid for showing it.
Finally, this framework will provide extra protection for recognised media sources through the proposed Regulation establishing a common framework for media services (European Media Freedom Act).[25] This requires VLOP to allow recognised media sources to declare their status and imposes additional transparency and consultation obligations on VLOP in relation to the restriction or suspension of content from those sources.
The emergence of content regulation laws is not limited to the EU. There has been an explosion of new laws across the globe, including India’s Intermediary Guidelines and Digital Media Ethics Code, Singapore’s Protection from Online Falsehoods and Manipulation Act, and the UK’s Online Safety Bill. Content regulation laws have even been proposed in the US, including Texas’s House Bill 20 and Florida’s Stop Social Media Censorship Act. However, these US laws have generally failed on the basis that they violate the social media companies’ strong rights to freedom of speech under the First Amendment.
While societal factors (such as views on privacy and freedom of speech) have a significant impact on online harms laws, the EU’s position as a ‘regulatory supertanker’ means the DSA is likely to play a pivotal role in the development of other content regulation laws around the world.
The DSA will not apply to the UK. However, the UK has chosen to implement similar obligations by way of its Online Safety Bill, which was introduced to Parliament in March 2022. This includes placing significant emphasis on conducting risk assessments against a range of different harms.
The Online Safety Bill appears to take a more interventionist approach than the DSA, including giving Ofcom (the UK’s communications regulator) the ability to mandate the use of specific content moderation technology. The ‘bells and whistles’ attached to these laws are also very different. For example, the Online Safety Bill contains a range of collateral obligations, such as giving users the option to verify their identity and requiring pornographic websites to block access by children.
The Online Safety Bill has, however, had a difficult passage, with significant concerns as to whether it will interfere with freedom of speech. As part of the Conservative leadership campaign earlier this summer, one of the leadership contenders described the Bill as being in ‘no fit state to become law’ and its provisions addressing lawful but harmful content as ‘legislating for hurt feelings’.[26] While the provisions on lawful but harmful material have since been dropped to address these concerns, this illustrates the difficult and politically charged issues that arise when attempting to regulate user-generated content online.
As more and more jurisdictions develop content regulation laws, they are likely to do so in ways that reflect their national traditions and attitudes, particularly the relative importance of free speech compared to privacy, safety and security. While there are, and will continue to be, common themes in this emerging area of law, the global picture looks set to be fragmented and inconsistent. The obligations in the DSA are summarised in the table here.
This is an unedited version of the article published in the Global Privacy Law Review by Kluwer Law International. The full citation of the published version is: Peter Church & Ceyhun Necati Pehlivan, ‘The Digital Services Act (DSA): A New Era For Online Harms and Intermediary Liability’, (2023) 4 Global Privacy Law Review 53-59.
[1] Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 Oct. 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act).
[2] The European Parliament and the Council of the European Union adopted the DSA on 5 Jul. 2022 and 29 Sep. 2022, respectively.
[3] Official Journal of the European Union, 65 L 277 1 (27 Oct. 2022), https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=OJ:L:2022:277:FULL&from=EN (accessed 1 Dec. 2022).
[4] A hosting service is defined as the storage of information provided by, and at the request of, a recipient of the service in accordance with Art. 3(g)(iii) of the DSA.
[5] A caching service is defined as the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients upon their request in accordance with Art. 3(g)(ii) of the DSA.
[6] A mere conduit service is defined as the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network in accordance with Art. 3(g)(i) of the DSA.
[7] Directive 2000/31/EC of the European Parliament and of the Council of 8 Jun. 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (Directive on electronic commerce).
[8] Art. 6(1), DSA.
[9] Art. 6(3), DSA.
[10] Recital 24, DSA.
[11] Opinion of Advocate General Maciej Szpunar (2 Jun. 2022), Christian Louboutin v. Amazon, Joined Cases C‑148/21 and C‑184/21, ECLI:EU:C:2022:422, paras 65-72.
[12] Ibid., para. 101.
[13] Art. 7, DSA.
[14] Recital 22, DSA.
[15] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 Apr. 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).
[16] Art. 27(2), GDPR.
[17] Art. 3(i), DSA.
[18] Directive 2005/29/EC of the European Parliament and of the Council of 11 May 2005 concerning unfair business-to-consumer commercial practices in the internal market and amending Council Directive 84/450/EEC, Directives 97/7/EC, 98/27/EC and 2002/65/EC of the European Parliament and of the Council and Regulation (EC) No 2006/2004 of the European Parliament and of the Council (Unfair Commercial Practices Directive).
[19] European Data Protection Board, Guidelines 3/2022 on Dark patterns in Social Media Platform Interfaces: How to Recognize and Avoid Them (adopted on 14 Mar. 2022).
[20] Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 Apr. 2021 on addressing the dissemination of terrorist content online.
[21] Proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse (2022/0155(COD)).
[22] Art. 3, Terrorist Content Regulation.
[23] Art. 10, CSAM Regulation.
[24] Proposal for a Regulation of the European Parliament and of the Council on the transparency and targeting of political advertising (2021/0381(COD)).
[25] Proposal for a Regulation of the European Parliament and of the Council establishing a common framework for media services in the internal market (European Media Freedom Act) and amending Directive 2010/13/EU (2022/0277 (COD)).
[26] Sophie Morris, Tory Leadership Candidate Kemi Badenoch Says Online Safety Bill Is ‘In No Fit State To Become Law’, Sky News (14 Jul. 2022), https://news.sky.com/story/tory-leadership-candidate-kemi-badenoch-says-online-safety-bill-is-in-no-fit-shape-to-become-law-12651674 (accessed 1 Dec. 2022).