Overview of the Digital Services Act (DSA)
The DSA regulates intermediary services, including online platforms such as marketplaces, social media platforms, video-sharing platforms, app stores, and online travel and accommodation platforms. With it, the EU aims to prevent illegal and harmful activities online and the spread of disinformation by ensuring user safety, protecting fundamental rights, and creating a fair and transparent online environment.
Key Goals of the DSA
- Better protection of fundamental rights
- More control and easier reporting of illegal content
- Stronger protection of children online
- Less exposure to illegal content
- Increased transparency in content moderation decisions
Organizations Required to Report
The DSA applies to all providers of intermediary services offering their services to users established or located in the EU. A specific subset, Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), i.e., services that individually reach 45 million or more active monthly users in the EU, is subject to a yearly external audit assessing compliance with the DSA.
- In 2023, 17 online platforms qualified as VLOPs and 2 search engines qualified as VLOSEs; these had an audit obligation over audit year 2023/2024.
- In 2024, 6 additional VLOPs were designated under the DSA and have an audit obligation over audit year 2024/2025*.
* Temu (designated as of 31 May 2024) and XNXX.com (designated as of 10 July 2024) had not published a DSA audit report as of December 2025 and were therefore not included in this publication.
KPMG's Key Takeaways from the 2025 DSA Audit Reports
Overall audit opinions
One VLOP (Stripchat) received a fully ‘positive’ opinion, one VLOP (Snapchat) received a ‘positive-with-comments’ opinion, and 19 VLOPs received ‘negative’ opinions. For Facebook and Instagram, no overall opinion was issued due to the significant number of ongoing European Commission (EC) investigations.
Opinions on obligation level
This year shows clear overall progress at the obligation level: positive opinions increased from 72% to 79%, while negative opinions nearly halved (from 11% to 6%).
Common Issues Identified
Comparable to the previous year, several areas continue to present challenges for many platforms, in particular Article 14 (Terms & Conditions), Article 16 (Notice & Action Mechanisms), and Articles 15, 24 and 42 (Transparency Reporting). In addition, obligations related to Content Moderation received the highest number of negative opinions across all articles this year.
Regulatory attention
Regulatory attention remains high. Supervisory focus continues to center on Articles 34–35 (systemic risk), with emerging pressure points for the coming year arising from new developments around Article 28 (protection of minors), Article 40 (researcher data access), and the Code of Conduct on Disinformation.