The Digital Services Act (DSA) is a key part of the Digital Services Package, a set of legislative initiatives put forward by the European Commission. These initiatives seek to create a safer digital space in which the fundamental rights of all users of digital services are protected, and to establish a level playing field that fosters innovation, growth and competitiveness across the EU and globally. Hermes Peraza and our Risk Consulting team explain the DSA below.

The DSA focuses on consumer protection, online content regulation and prevention of the illegal trade of goods. The regulation applies to intermediary services offered to recipients who have their place of establishment or are located in the EU, irrespective of where the providers of those intermediary services have their place of establishment.

The DSA entered into force in November 2022. It has applied to designated Very Large Online Platforms and Very Large Online Search Engines (VLOPs/VLOSEs) since 25 August 2023, and applies to all other in-scope entities from 17 February 2024.

DSA: key objectives and features

  • Increase the accountability of online platforms for illegal content on their services and give users enhanced mechanisms for reporting harmful content
  • Increase transparency in online advertising, giving users greater control over how their personal data is used
  • Provide greater protection for children online by banning targeted advertising to children
  • Impact thousands of online platforms (large and small) such as social media platforms, marketplaces, and app stores
  • Minimise systemic risks (such as the dissemination of illegal content, discriminatory content, and content related to gender-based violence) and ensure that appropriate controls are in place
  • Provide a crisis response mechanism which allows for intervention by the European Commission in the event of threats to public health and security
  • Impose penalties for breaches of up to 6% of annual worldwide turnover, which for some global platforms could reach $7 billion

10 key themes to address for DSA Compliance

01

Governance arrangements

Determine single points of contact, legal representatives and compliance heads; communicate who they are; and empower them. Also, work with your Board to establish DSA compliance duties and wider oversight structures, including reporting frequency, the agenda time given to DSA compliance discussions, and the type and depth of Management Information (MI) and reporting used to inform the Board and other governance fora on DSA risk and compliance matters.

02

Transparency to users

This includes updating terms and conditions and communicating changes, for example information on any policies, procedures, measures and tools used for content moderation, including algorithmic decision-making and human review. Firms should also take steps to ensure recommender system transparency and give users a genuine choice over how content is recommended to them.
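
To make recommender optionality concrete, the minimal Python sketch below offers a personalised ranking alongside a non-profiling, chronological one. The `Item` fields, option names and `build_feed` function are illustrative assumptions on our part, not an interface prescribed by the DSA.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical item and ranking types, for illustration only.
@dataclass
class Item:
    item_id: str
    posted_at: float        # Unix timestamp
    relevance_score: float  # output of a profiling-based model

def ranked_by_profile(items: list[Item]) -> list[Item]:
    """Default feed: ordered by a personalised relevance model."""
    return sorted(items, key=lambda i: i.relevance_score, reverse=True)

def ranked_chronologically(items: list[Item]) -> list[Item]:
    """Non-profiling alternative: newest first, no personal data used."""
    return sorted(items, key=lambda i: i.posted_at, reverse=True)

# User-selectable recommender options, including one not based on profiling.
RECOMMENDER_OPTIONS: dict[str, Callable[[list[Item]], list[Item]]] = {
    "personalised": ranked_by_profile,
    "chronological": ranked_chronologically,
}

def build_feed(items: list[Item], user_choice: str) -> list[Item]:
    # Fall back to the non-profiling option if the choice is unknown.
    ranker = RECOMMENDER_OPTIONS.get(user_choice, ranked_chronologically)
    return ranker(items)
```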

03

Online Platform design and controls

Implement measures to prevent interface designs that deceive or manipulate recipients, or that otherwise impair their ability to make free and informed decisions. Firms should also implement transparent advertising methods, take steps to protect minors, and implement bans on adverts targeted at children or based on special categories of users' personal data. In addition, online platforms should implement Know Your Business Customer (KYBC) controls and enable compliance by design, in particular in the case of online marketplaces.
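
As an illustration of a KYBC control, the sketch below blocks a trader from offering products until the required identity and payment details are on file. The `TraderProfile` fields are our own assumptions about the kind of information a marketplace would collect, not a definitive reading of the regulation.

```python
from dataclasses import dataclass, fields
from typing import Optional

# Hypothetical trader record; fields mirror the kind of information a
# marketplace would need to collect before a trader can sell.
@dataclass
class TraderProfile:
    name: Optional[str] = None
    address: Optional[str] = None
    id_document_ref: Optional[str] = None
    payment_account: Optional[str] = None
    trade_register_number: Optional[str] = None
    self_certification: bool = False  # trader certifies only legal products

def kybc_gaps(trader: TraderProfile) -> list[str]:
    """Return the names of required KYBC fields that are missing."""
    missing = [f.name for f in fields(trader)
               if f.name != "self_certification"
               and not getattr(trader, f.name)]
    if not trader.self_certification:
        missing.append("self_certification")
    return missing

def may_offer_products(trader: TraderProfile) -> bool:
    # Compliance by design: block listings until KYBC data is complete.
    return not kybc_gaps(trader)
```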

04

Mechanisms to counter illegal content

Develop notice-and-action mechanisms that allow individuals or entities to flag information they consider illegal. Also, establish processes to notify suspicions of criminal offences to relevant law enforcement or judicial authorities, put in place the technical and organisational measures needed to handle notices from trusted flaggers with priority, and implement protections against misuse, including suspension of services.
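
A minimal sketch of a notice intake check follows, assuming a hypothetical `IllegalContentNotice` record. The required elements (an exact location, a substantiated explanation, contact details and a good-faith declaration) reflect what a valid notice typically needs to contain; the field names and thresholds are illustrative.

```python
from dataclasses import dataclass

# Hypothetical notice record; the field names are ours, but they track
# the elements a sufficiently precise notice should contain.
@dataclass
class IllegalContentNotice:
    content_url: str          # exact electronic location of the item
    explanation: str          # why the notifier considers it illegal
    notifier_name: str
    notifier_email: str
    good_faith_declaration: bool

def validate_notice(notice: IllegalContentNotice) -> list[str]:
    """Collect validation errors; an empty list means the notice is actionable."""
    errors = []
    if not notice.content_url.startswith(("http://", "https://")):
        errors.append("content_url must be an exact URL")
    if len(notice.explanation.strip()) < 20:
        errors.append("explanation is not sufficiently substantiated")
    if not notice.notifier_email or "@" not in notice.notifier_email:
        errors.append("a contact email address is required")
    if not notice.good_faith_declaration:
        errors.append("good-faith declaration is required")
    return errors
```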

05

Appeals and Complaints

Implement an internal complaint-handling system and related processes covering decisions to remove, disable or restrict access to information; to suspend or terminate the provision of the service or the recipient's account; and to suspend, terminate or otherwise restrict the ability to monetise information provided by recipients. Online platforms must also handle complaints effectively and efficiently, and inform users of out-of-court dispute settlement mechanisms.
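
The sketch below models a minimal complaint queue over the decision types users can contest. The class names and the 14-day service level are assumptions for illustration; the DSA requires timely handling but does not fix that deadline.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from enum import Enum

# Decision types users must be able to complain about; names are ours.
class RestrictionDecision(Enum):
    CONTENT_REMOVED = "content_removed"
    CONTENT_RESTRICTED = "content_restricted"
    SERVICE_SUSPENDED = "service_suspended"
    ACCOUNT_TERMINATED = "account_terminated"
    MONETISATION_RESTRICTED = "monetisation_restricted"

@dataclass
class Complaint:
    complaint_id: str
    decision: RestrictionDecision
    lodged_at: datetime
    needs_human_review: bool = True  # outcomes should not rest on automation alone
    resolved: bool = False

def overdue_complaints(queue: list[Complaint],
                       sla: timedelta = timedelta(days=14)) -> list[Complaint]:
    """Flag unresolved complaints older than an assumed internal SLA."""
    now = datetime.utcnow()
    return [c for c in queue if not c.resolved and now - c.lodged_at > sla]
```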

06

Risk and Control Assessments

Very Large Online Platforms and Search Engines (VLOPs/VLOSEs) must establish procedures to identify, analyse and assess any systemic risks stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services. This includes performing systemic risk assessments based on probability and severity, and implementing controls to mitigate systemic risks. It is likely that non-VLOPs/VLOSEs will, to some extent, also need to put similar processes in place.
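
A common way to operationalise this is a probability-times-severity score. The sketch below is illustrative only; the 1-to-5 scales, the mitigation threshold and the example risks are our assumptions, not values taken from the regulation.

```python
from dataclasses import dataclass

# Minimal probability x severity scoring, as commonly used in risk
# assessments; scales and thresholds here are illustrative assumptions.
@dataclass
class SystemicRisk:
    name: str
    probability: int  # 1 (rare) to 5 (almost certain)
    severity: int     # 1 (negligible) to 5 (critical)

    @property
    def score(self) -> int:
        return self.probability * self.severity

def prioritise(risks: list[SystemicRisk], threshold: int = 12) -> list[SystemicRisk]:
    """Return risks at or above the mitigation threshold, highest score first."""
    return sorted((r for r in risks if r.score >= threshold),
                  key=lambda r: r.score, reverse=True)

risks = [
    SystemicRisk("dissemination of illegal content", probability=4, severity=5),
    SystemicRisk("recommender amplification of disinformation", 3, 4),
    SystemicRisk("negative effects on minors", 2, 5),
]
for risk in prioritise(risks):
    print(f"{risk.name}: score {risk.score} -> mitigation plan required")
```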

07

Compliance Management and Oversight

Very Large Online Platforms and Search Engines (VLOPs/VLOSEs) should also establish an independent compliance function with a direct reporting line to the Board. This means that a compliance framework and operating model will likely be required to perform compliance oversight activities and processes. Consideration should be given to establishing a central compliance team rather than operating in silos, particularly in the case of global firms.

08

Crisis response, communication and learning

A crisis is deemed to have occurred where extraordinary circumstances lead to a serious threat to public security or public health in the EU or in significant parts of it. Online platforms should identify, assess and implement measures to prevent or eliminate serious threats; develop crisis protocols for addressing crisis situations; and report to the Commission at regular intervals on the implementation and the qualitative and quantitative impact of the measures taken to mitigate serious threats.

09

Transparency Reporting, Data and MI

Online platforms should develop and publish annual content moderation transparency reports covering, for example, the number of notices submitted, the number of complaints received, and the moderation actions taken. Online platforms should also establish processes that allow them to respond to data access requests from the Commission or Digital Services Coordinators.
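
The sketch below illustrates the kind of aggregation behind such a report, assuming a hypothetical `ModerationAction` log entry; the reason and source labels are examples, not categories mandated by the DSA.

```python
from collections import Counter
from dataclasses import dataclass
from statistics import median

# Hypothetical moderation log entry; the aggregates below illustrate the
# kind of figures a transparency report typically discloses.
@dataclass
class ModerationAction:
    reason: str              # e.g. "hate_speech", "counterfeit_goods"
    source: str              # "user_notice", "trusted_flagger", "own_initiative"
    handling_time_hours: float

def transparency_summary(actions: list[ModerationAction]) -> dict:
    """Aggregate a (non-empty) moderation log into report-ready figures."""
    return {
        "total_actions": len(actions),
        "actions_by_reason": dict(Counter(a.reason for a in actions)),
        "actions_by_source": dict(Counter(a.source for a in actions)),
        "median_handling_time_hours": median(a.handling_time_hours
                                             for a in actions),
    }
```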

10

Assurance and Remediation

Very Large Online Platforms and Search Engines (VLOPs/VLOSEs) should commission an independent audit at least annually and implement any remediation measures arising from it.

How KPMG can help

Our team has deep technical expertise across all DSA-related areas, including: implementing compliance functions, frameworks and operating models; implementing complaint-handling and issue management systems and processes; designing and implementing Know Your Business Customer (KYBC) controls; performing systemic risk assessments; and performing independent reviews and audits.

Our capabilities draw on experts from IT Assurance, Risk Consulting, Technology Law, Algorithm Assurance, Privacy, Cybersecurity, and Forensic teams. In addition, our DSA services are powered by accelerators to ensure an efficient process. These include:

  • A global better-practice DSA audit criteria framework;
  • A DSA Compliance Assessment tool, developed and used for DSA compliance projects at other VLOPs/VLOSEs;
  • A wealth of experience from other regulated sectors regarding the establishment of independent compliance functions; performing risk assessments; and the design and implementation of compliance controls, covering many aspects of the DSA such as governance arrangements, KYBC controls, complaints handling, and related risk mitigation requirements; and,
  • An algorithm assurance methodology supporting the audit of DSA obligations in relation to your recommender and content moderation systems.

Contact our Risk Consulting team
