Digital Services Act (DSA)
https://en.wikipedia.org/wiki/Digital_Services_Act
The Digital Services Act[1] (DSA) is an EU regulation adopted in 2022 that addresses illegal content, transparent advertising and disinformation. It updates the Electronic Commerce Directive 2000 in EU law,[2][3] and was proposed alongside the Digital Markets Act (DMA).
The DSA applies to online platforms and intermediaries such as social networks, marketplaces and app stores.[4] Key requirements include disclosing to regulators how their algorithms work, providing users with explanations for content moderation decisions, and implementing stricter controls on targeted advertising. It also imposes specific rules on "very large" online platforms and search engines (those having more than 45 million monthly active users in the EU).
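To make the size threshold concrete, here is a minimal Python sketch; the constant, the function name, and the sample user counts are hypothetical and simply restate the 45 million monthly active EU users figure above.

```python
# The DSA's "very large" designation hinges on a single threshold:
# more than 45 million average monthly active users in the EU.
VLOP_THRESHOLD = 45_000_000

def is_very_large_platform(monthly_active_eu_users: int) -> bool:
    """True if a service exceeds the DSA's very-large-platform threshold."""
    return monthly_active_eu_users > VLOP_THRESHOLD

print(is_very_large_platform(50_000_000))  # True: extra obligations apply
print(is_very_large_platform(10_000_000))  # False: baseline obligations only
```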
Objectives
Ursula von der Leyen proposed a "new Digital Services Act" in her 2019 bid for the European Commission's presidency.[6]
The expressed purpose of the DSA is to update the European Union's legal framework for illegal content on intermediaries, in particular by modernising the e-Commerce Directive adopted in 2000. In doing so, the DSA aims to harmonise the divergent national laws that have emerged across the European Union to address illegal content.[2] Most prominent amongst these laws has been the German NetzDG, and similar laws in Austria ("Kommunikationsplattformen-Gesetz") and France ("Loi Avia"). With the adoption of the Digital Services Act at the European level, those national laws are overridden and have to be amended.[7]
In practice, this means new rules regarding illegal content, transparent advertising and disinformation.
New obligations on platform companies
The DSA is meant to "govern the content moderation practices of social media platforms" and address illegal content.[8] It is organised in five chapters, with the most important chapters regulating the liability exemption of intermediaries (Chapter 2), the obligations on intermediaries (Chapter 3), and the cooperation and enforcement framework between the Commission and national authorities (Chapter 4).
The DSA maintains the existing rule according to which companies that host others' data become liable when informed that this data is illegal.[8] This so-called "conditional liability exemption" is fundamentally different[9][10] from the broad immunities given to intermediaries under the equivalent rule ("Section 230 CDA") in the United States.
The DSA applies to intermediary service providers that offer their services to users based in the European Union, irrespective of whether the intermediary service provider is established in the European Union.[11]
In addition to the liability exemptions, the DSA introduces a wide-ranging set of new obligations on platforms, including obligations to disclose to regulators how their algorithms work, and obligations that create transparency over how content removal decisions are taken and over the way advertisers target users. The European Centre for Algorithmic Transparency was created to aid enforcement of these provisions.[12]
A December 2020 Time article said that while many of its provisions only apply to platforms that have more than 45 million users in the European Union, the Act could have repercussions beyond Europe. Platforms including Facebook, Twitter, TikTok, and Google's subsidiary YouTube would meet that threshold and be subject to the new obligations.[13]
A 16 November 2021 Internet Policy Review article listed some of the new obligations, including mandatory "notice-and-action" requirements that must, for example, respect fundamental rights, mandatory redress for content removal decisions, and a comprehensive risk management and audit framework.[14]
Companies that do not comply with the new obligations risk fines of up to 6% of their global annual turnover. In addition, the Commission can apply periodic penalties of up to 5% of the average daily worldwide turnover for each day of delay in complying with remedies, interim measures, and commitments. As a last-resort measure, if the infringement persists, causes serious harm to users and entails criminal offences involving a threat to persons' life or safety, the Commission can request the temporary suspension of the service.[15]
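As a back-of-the-envelope illustration of how these penalty ceilings scale, here is a minimal Python sketch computing the maximum one-off fine and the maximum cumulative periodic penalty for a hypothetical platform; the function names and the turnover figure are invented for illustration and do not come from the Act itself.

```python
# Sketch of the penalty ceilings described above: a one-off fine of up to
# 6% of global annual turnover, plus periodic penalties of up to 5% of the
# average daily worldwide turnover for each day of delay.
# All figures are hypothetical.

def max_fine(annual_turnover_eur: float) -> float:
    """Ceiling of a one-off fine: 6% of global annual turnover."""
    return 0.06 * annual_turnover_eur

def max_periodic_penalty(annual_turnover_eur: float, days_of_delay: int) -> float:
    """Ceiling of cumulative periodic penalties: 5% of average daily
    worldwide turnover for each day of non-compliance."""
    avg_daily_turnover = annual_turnover_eur / 365
    return 0.05 * avg_daily_turnover * days_of_delay

turnover = 100e9  # hypothetical EUR 100 billion global annual turnover
print(f"Max one-off fine:          EUR {max_fine(turnover):,.0f}")                   # 6,000,000,000
print(f"Max penalty after 30 days: EUR {max_periodic_penalty(turnover, 30):,.0f}")   # ~410,958,904
```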