The EU Digital Services Act (DSA) is transforming how online platforms – from social media networks to online marketplaces – operate. It sets new rules for content moderation, transparency and data access, while reinforcing user rights and platform accountability.
Understanding how these obligations work in practice is often challenging. This course helps you navigate the regulation and explains how it reshapes platform governance. You will learn about notice-and-action mechanisms, transparency reporting, systemic risk obligations and what compliance looks like in practice for platforms, researchers, regulators and civil society.
The course explains the core concepts of the DSA and focuses on the operational reality of platform governance: how content moderation decisions are made, documented, challenged and audited – and why these requirements are challenging to implement at scale across heterogeneous systems such as marketplaces, social platforms and emerging generative services.
You will learn how the DSA's regulation of platforms goes far beyond the removal of illegal content. The DSA introduces procedural safeguards and transparency obligations that make platform decisions contestable and reviewable. The course demonstrates how key transparency infrastructures – from the Statement of Reasons (SoR) database and platform transparency reports to advertising repositories – support accountability and enable external scrutiny of content moderation and platform governance.
The course also explains why data access is central to evidence-based enforcement and research under the DSA, including the vetted researcher pathway for systemic risk research and how this intersects with platform risk assessment and audit duties.
By the end of the course, you will be able to connect DSA concepts to concrete situations and processes, and critically evaluate compliance claims using the DSA's governance toolkit.