Content Regulation in the European Union: The Digital Services Act
About this Book
Illegal content and lawful but harmful content (most notably hate speech and fake news, but also violent videos, copyright infringement, and child pornography) pose a crucial problem on digital platforms such as Facebook, YouTube, TikTok, and Twitter. The EU's 2022 Digital Services Act aims to tackle this problem by introducing an updated horizontal framework covering all categories of content and activities on intermediary services. This raises several questions. How far do national and European free speech guarantees reach? If hate speech can be banned to protect victims' rights, how can the prohibition of fake news be justified? How much leeway remains for platforms' private content moderation? Who is responsible for fighting and taking down illegal content? How can those affected by de-platforming, content takedowns, or shadow banning assert their right to freedom of opinion? Finally, how will these legal responsibilities be enforced? These questions are addressed in the contributions to this edited volume, which proceeds from the 2022 Annual Conference of the Institute for Digital Law Trier (IRDT).
Overview of Contributions:
The Digital Services Act: Introduction and Overview | Lea Katharina Kumkar | p. 1
Potentials and Limits of Filter Technology for the Regulation of Hate Speech and Fake News | Martin Steinebach | p. 13
Freedom of Speech goes Europe - EU Laws for Online Communication | Antje von Ungern-Sternberg | p. 27
Taking or Escaping Legislative Responsibility? EU Fundamental Rights and Content Regulation under the DSA | Mattias Wendel | p. 59
The Digital Services Act: A General Assessment | Florence G’Sell | p. 85
Impacts of the Digital Services Act on the Facebook "Hate Speech" Decision by the German Federal Court of Justice | Ruth Janal | p. 119