In this article
EU Commission Publishes Proposal for Data Act
In February 2022, the EU Commission proposed a draft Data Act which aims to ensure fairness in a digital environment, stimulate a competitive data market, open opportunities for data-driven innovation and make data more accessible for all. Specific objectives include facilitating the access and use of data by consumers and businesses and putting in place safeguards against unlawful data transfers.
The following market participants will be subject to the Data Act:
- Manufacturers of products that generate or collect data concerning their use or environment and are able to communicate that data via a public network. These products may include vehicles, home equipment, consumer goods, medical and health devices, and agricultural or industrial machinery;
- Data holders that make data available to data recipients;
- Providers of digital services that are embedded in an Internet of Things (‘IoT’) product, or connected with it, and are necessary for the product to perform one of its functions;
- Data recipients to whom data is made available;
- Public sector bodies and EU institutions, agencies or bodies that request data holders to make data available in order to carry out tasks in the public interest; and
- Providers of data processing services like cloud services and edge computing providers.
Article 1(2) of the proposal limits the territorial scope to IoT products and related services that are "placed on the market in the European Union", and to providers of data processing services "offering such services to customers in the European Union".
The proposal for the Data Act includes:
- Measures to allow users of connected devices to gain access to data generated by them, which is often exclusively harvested by manufacturers; and to share such data with third parties to provide aftermarket or other data-driven innovative services. It maintains incentives for manufacturers to continue investing in high-quality data generation, by covering their transfer-related costs and excluding the use of shared data in direct competition with their product.
- Measures to rebalance negotiation power for SMEs by preventing abuse of contractual imbalances in data sharing contracts. The Data Act will shield them from unfair contractual terms imposed by a party with a significantly stronger bargaining position. The Commission will also develop model contractual terms in order to help such companies to draft and negotiate fair data-sharing contracts.
- Means for public sector bodies to access and use data held by the private sector that is necessary in exceptional circumstances, particularly in the case of a public emergency, such as floods and wildfires, or to implement a legal mandate if the data is not otherwise available. Data insights are needed to respond quickly and securely, while minimising the burden on businesses.
- New rules allowing customers to effectively switch between different cloud data-processing services providers and putting in place safeguards against unlawful data transfers.
- In addition, the Data Act reviews certain aspects of the Database Directive, which was created in the 1990s to protect investments in the structured presentation of data. Notably, it clarifies that databases containing data from Internet-of-Things devices and objects should not be subject to separate legal protection. This will ensure that they can be accessed and used.
The Draft Data Act and the Digital Economy
Undoubtedly, the much-anticipated Data Act demonstrates the EU Commission’s commitment to ensuring that EU law keeps up with the changing Digital Economy. The draft Data Act aims to counter a business model in which data from Internet of Things (IoT) devices is locked in for use only by the manufacturer, or by a few players, with the goal of making the data market more competitive.
Striking this balance is difficult, as data privacy implications are inherent in the sharing of data, particularly regarding the processing of data gathered from IoT devices.
The draft Data Act is likely to change over the next few months. The content confirms that the EU Commission is committed to its strategic vision on data, which is regarded as an “essential resource” for the Digital Economy. The EU has started to create a framework for organisations in the digital sector to facilitate innovation as well as building trust with customers and users.
Political agreement reached on Digital Services Act
EU institutions announced a political agreement on the final text for the Digital Services Act (‘DSA’). The legislation includes various prohibitions on targeted advertising, specifically the targeting of minors and ads based on sensitive personal data. European Commission President, Ursula von der Leyen, said that the regulation "will upgrade the ground-rules for all online services in the EU." The DSA will come into force immediately once adopted, but will apply to platforms 15 months after its entry into force.
The DSA applies to online intermediary services. The organisations in scope are:
- Intermediary services offering network infrastructure: Internet access providers, domain name registrars;
- Hosting services such as cloud computing and webhosting services;
- Very large online search engines reaching more than 10% of the 450 million consumers in the EU, which therefore hold more responsibility for curbing illegal content online;
- Online platforms bringing together sellers and consumers such as online marketplaces, app stores, collaborative economy platforms and social media platforms; and
- Very large online platforms, with a reach of more than 10% of the 450 million consumers in the EU, which could pose particular risks in the dissemination of illegal content and societal harms.
The DSA outlines a number of measures; however, the obligations depend on the intermediary service’s role, size and impact in the online ecosystem. They include the following:
Measures to counter illegal goods, services or content online, such as:
- A mechanism for users to easily flag such content and for platforms to cooperate with so-called ‘trusted flaggers’; and
- New obligations on the traceability of business users in online marketplaces.
New measures to empower users and civil society, including:
- The possibility to challenge platforms' content moderation decisions and seek redress, either via an out-of-court dispute mechanism or judicial redress;
- Provision of access for vetted researchers to the key data of the largest platforms, and access for NGOs to public data, to provide more insight into how online risks evolve; and
- Transparency measures for online platforms on a variety of issues, including on the algorithms used for recommending content or products to users.
Measures to assess and mitigate risks, such as:
- Obligations for very large platforms and very large online search engines to take risk-based action to prevent the misuse of their systems and undergo independent audits of their risk management systems;
- Mechanisms to adapt swiftly and efficiently in reaction to crises affecting public security or public health; and
- New safeguards for the protection of minors and limits on the use of sensitive personal data for targeted advertising.
Measures to enhance supervision and enforcement, such as:
- Enhanced supervision and enforcement by the Commission when it comes to very large online platforms. The supervisory and enforcement framework also confirms important roles for the independent Digital Services Coordinators and Board for Digital Services.
In June, a majority of MEPs pushed back against the version of the DSA that was published after the informal agreement was reached in April. However, the European Parliament published, on 15 June 2022, the text of the provisional agreement on the DSA. In addition, the Parliament confirmed, on 16 June 2022, that its Internal Market Committee had endorsed the provisional agreement. The DSA will apply fifteen months after its entry into force or from 1 January 2024, whichever is later. However, very large online platforms and search engines will only have four months after adoption to comply with the obligations outlined in the DSA.
The European Commission will be directly involved with the supervision and enforcement of obligations for very large platforms reaching more than 10% of consumers in Europe. Fines can reach up to 6% of the global turnover of a service provider.
CNIL Publishes FAQ on Google Analytics
In June, the CNIL published a FAQ on Google Analytics, following on from the Google Analytics case. Some key points are as follows:
- Organisations, having received an order, have one month to comply;
- All Data Protection Authorities should share the same approach and position;
- Supplementary measures implemented by Google Analytics (encryption and pseudonymisation) are not sufficient to address the requirements of Schrems II in this case;
- It is not possible to configure Google Analytics' settings so that no data is sent to the US;
- No supplementary measures can make Google Analytics compliant, except the use of a proxy under strict conditions (here the CNIL refers to another document). The CNIL said that pseudonymisation will be suitable as a supplementary measure only if it ensures that the information transferred does not allow re-identification of the person by public authorities, which, given the use of unique identifiers, is difficult in practice; and
- The controller cannot adopt a "risk-based approach", based on the probability that the data can be accessed.
Review of Cookie Banners
Austrian privacy group NOYB (“None of Your Business”) has submitted 270 draft complaints to website operators who use cookie banners that do not comply with the GDPR. The batch is the second of a series of complaints by the NGO led by Max Schrems. NOYB’s battle against non-compliant cookie banners started in May 2021, when the organisation presented over 500 complaints. During the first round, 42% of the organisations attempted to make the relevant adaptations within one month; however, the overwhelming majority were still not compliant, according to the group. The activists are giving the companies a 60-day grace period to bring their cookie banners in line with GDPR requirements. NOYB said it will continue to scan and review cookie banners of up to 10,000 websites in the coming months.
CNIL and Cookie Walls
Cookie Walls make access to a service on a webpage conditional on the Internet user accepting cookies on their device (computer, smartphone, etc.).
CNIL's new guidance supports the view that Cookie Walls should be allowed if there is a "real and equitable alternative". According to CNIL, the alternative should be either:
(i) the publisher's own tracking-free alternative; or
(ii) a tracking-free alternative available on the market.
CNIL explicitly states that a paid option is an acceptable alternative, as long as the fee is "reasonable" (that is, not so high that it would deprive users of a real choice). The CNIL suggests that virtual wallets could be used. However, it is worth noting that CNIL stated that the determination of what is "reasonable" is subject to a case-by-case analysis.
The CNIL does state that necessity always needs to be taken into account, which means carefully choosing which cookies are covered by the consent generated by the Cookie Wall. The CNIL clarifies this with an example: “if a publisher considers that advertising cookies are necessary to remunerate the service, the publisher cannot block access if a user doesn't consent to personalisation”.
European Data Protection Board (‘EDPB’)
The EDPB published its Annual Report 2021
In May 2022, the EDPB published its Annual Report 2021. The Annual Report addresses the EDPB's activities in 2021, covering topics such as adopted guidance and opinions, as well as its involvement in various legislative consultations. In addition, the Annual Report emphasises that the EDPB continued to focus on international transfers of personal data and notes that in 2021, the EDPB adopted the final version of the recommendations on supplementary measures following Schrems (C-311/18) ('Schrems II'), the opinions on the UK draft adequacy decisions, and Guidelines on Codes of Conduct as tools for transfers.
Furthermore, the Annual Report outlines that another important area of focus in 2021 was digital policy, noting that as part of the framework of the EU's Digital Strategy, the European Commission put forward several proposals on which the EDPB and the European Data Protection Supervisor ('EDPS') issued legislative advice.
The Annual Report highlights that the EDPB's goals for 2022 include guidance on topics as varied as legitimate interest as a legal basis and the use of facial recognition by law enforcement authorities. Lastly, the Annual Report states that the EDPB will continue its work to optimise cooperation and enforcement.
EDPB Publishes Draft Guidelines on the Use of “Dark Patterns” in Social Media Interfaces
Social Media providers remain responsible and accountable for ensuring their platforms are in compliance with the GDPR. On 21 March 2022, the EDPB published draft Guidelines 3/2022 on Dark Patterns in Social Media Platform Interfaces. Dark patterns are defined as “interfaces and user experiences implemented on social media platforms that lead users into making unintended, unwilling and potentially harmful decisions regarding the processing of their personal data”. The objective of the Guidelines is to provide practical guidance to designers and users of social media platforms on how to identify and avoid these so-called “dark patterns” in social media interfaces that would violate the requirements set out in the GDPR. The Guidelines are intended to instruct organisations on how to design their platforms and user interfaces in a GDPR-compliant manner, as well as educate users on how certain practices they are subject to could run contrary to the GDPR.
These Guidelines are part of a broader trend in Europe and across other jurisdictions, whereby organisations are being held accountable where they offer websites, platforms or other consumer-facing interfaces in a way that could be interpreted as deceiving, manipulating, or unduly influencing consumers toward less privacy-protective choices. The Guidelines were open for consultation until 2 May 2022. They are expected to be updated and released in final form in the coming months.
The EDPB publishes finalised Guidelines on Codes of Conduct as data transfer tools
On 4 March 2022, the EDPB announced that it had published the final version of the Guidelines on Codes of Conduct as tools for transfers, following a public consultation launched in July 2021. In particular, the Guidelines highlight that under Article 46 of the GDPR, controllers and processors are required to implement appropriate safeguards for the transfers of personal data to third countries (i.e. a state that is not a member of the EEA) or international organisations. The Guidelines also outline that the GDPR diversifies the appropriate safeguards that may be used by organisations under Article 46 for framing transfers to third countries by introducing, among others, Codes of Conduct as a new transfer mechanism.
The Guidelines aim to clarify the role of the different actors involved in the setting of a code to be used as a transfer tool. Furthermore, the Guidelines, taking into account the Schrems II case and aiming to provide a level of protection consistent with alternative safeguards under Article 46 of the GDPR, set out a checklist of elements to be covered by a Code of Conduct intended for transfers.
Spanish Data Protection Authority: Regulatory Code of Conduct for Processing of Personal Data in Field of Clinical Trials
The Code of Conduct, approved on 25 February 2022, regulates how the sponsors of clinical studies of medicines and the Contract Research Organisations that decide to adhere to it must apply the GDPR.
In addition, the AEPD (“Agencia Española de Protección de Datos”), the Spanish Data Protection Agency, stated that the objective scope of application of the Code of Conduct is the processing of personal data in clinical investigations in general and clinical trials in particular, including processing related to compliance with the obligations imposed on the pharmaceutical industry by current regulations for the detection and prevention of adverse effects of medicinal products already on the market.
The Code of Conduct provides detail on the responsibilities of each party involved in a clinical trial (such as the clinical trial sponsor and contract research organisation) and addresses data protection impact assessments, codification of personal data, international data transfers, legal bases for processing, and data subject rights. A mediation procedure is also established by the Code of Conduct to support swift resolution of disputes in this area.
The AEPD emphasised that the Code of Conduct's scope of application is national, although it aspires to be a benchmark at a European level, as it is the first sectoral Code of Conduct that has been approved in Europe.
The EDPB publishes further Guidelines on Data Breach Notifications
The EDPB has published Guidelines on ‘Examples regarding Personal Data Breach Notifications’
The Guidelines provide examples of various breach scenarios, measures organisations should take, information on conducting risk assessments, appropriate risk mitigations and obligations in the event of a breach.
The Guidelines set out a number of scenarios including:
- ransomware attacks;
- misdirected communications to trusted third parties;
- highly confidential personal data sent by mail by mistake;
- lost or stolen devices and paper documents; and
- social engineering.
The Guidelines are intended to reduce the number of notifications being made to Data Supervisory Authorities where it is unlikely that there is a risk to the rights and freedoms of data subjects as a result of a breach.
The Guidelines emphasise that organisations should maintain a record of data breaches that do not meet the threshold for notification, including an explanation of the breach and the steps taken to mitigate the impact of the breach.
Consequences of the Clinical Trial Regulations (‘CTR’)
- Clinical trial submissions must include details of how the processing will adhere to national and European data protection laws, via a statement from the Data Protection Officer (‘DPO’) confirming compliance.
- The Clinical Trial Regulations (‘CTR’) note that consent can be used for processing of data subjects’ data outside of the clinical trial for scientific purposes; however, the EDPB took the view in Opinion 3/2019 that consent is not an appropriate legal basis in research settings, given that there is a clear imbalance of power between the data subject and the controller.
There is still some work to be done in aligning the regulations, however the key takeaways for DPOs are:
- Ensure there are processes in place to perform DPIAs on clinical trials for inclusion in clinical trial submissions; and
- In terms of using consent as a legal basis, the DPO should consider whether explicit consent should be sought for every form of processing (e.g. for trial participation, for data processing etc.).
Consumer groups can bring class actions for data protection infringements
The Court of Justice of the EU (‘CJEU’) has ruled that consumer protection associations may bring representative actions against big technology companies over GDPR infringements. The wider effect of the ruling will strengthen consumer groups’ rights, given that they are now deemed to fall within the scope of the organisations that can bring legal proceedings under the GDPR related to the protection of personal data.
In December 2021, the advocate general issued a non-binding opinion noting that, in a digitalised economy, personal data affects individuals in their capacity as consumers. Moreover, it would be paradoxical if a law intended to protect personal data were to reduce the protection of consumer rights.
The CJEU stated that these associations can bring representative actions regardless of whether they have been mandated by one or more data subjects, as in both cases the personal and material requirements of Article 80(2) of the GDPR are satisfied: the association pursues a public interest purpose consisting of guaranteeing the rights and freedoms of data subjects as consumers, and it 'considers' that the rights of a data subject under the GDPR have been infringed.
The Court added that in order to bring a class action, the entity representing the data subjects is not required to carry out the prior individual identification of the data subject affected by the data processing which is allegedly against the provisions of the GDPR. In addition, it is not necessary to allege a concrete breach of the rights conferred by the data protection rules, or the existence of actual damage suffered by the data subject as a result of the infringement of their rights.
‘UK SCCs’: IDTA, Addendum, and transitional provisions enter into force
The Information Commissioner’s Office (‘ICO’) announced in March 2022 the International Data Transfer Agreement (‘IDTA’) and the International Data Transfer Addendum to the European Commission’s Standard Contractual Clauses (‘SCCs’) for international data transfers (‘the UK Addendum’):
- The UK Addendum: The UK Addendum only works alongside the EU SCCs. However, as it operates as an addendum to the EU SCCs, it does not address their shortcomings, chiefly that they do not cover all scenarios. The new EU SCCs cannot be used if the importer is directly subject to the UK GDPR on an extra-territorial basis, and they can only be used where the exporter/importer relationship corresponds to one of the new EU SCCs’ modules (for example, there is no module that can be used if a processor transfers data to another processor that is not its sub-processor).
- The IDTA: The IDTA is a standalone agreement that is likely to be the preferred option for organisations which are only UK-based and only process personal data to which the UK GDPR applies. The IDTA is a single, “one-size-fits-all” agreement, rather than taking the modular approach of the new EU SCCs – once the tables in the IDTA are completed (covering the parties’ details, transfer description, security requirements, any extra protection clauses which may be required and any commercial provisions which the parties may want added), it can be signed as is.
In both scenarios a transfer impact assessment must be carried out.
After 21 September 2022, organisations must use the IDTA or the UK Addendum if they want to enter into new arrangements for transfers which are subject to the UK GDPR. Furthermore, any existing arrangements for UK transfers based on the old EU SCCs must be replaced by 21 March 2024. This means that the new EU SCCs on their own are not valid for UK transfers; the IDTA or the UK Addendum should be implemented instead.
What’s the best tool for my organisation?
The ICO published the UK Addendum as a transfer mechanism to provide large multinational companies with an easy tool to allow for the transfer of personal data that is subject to both the EU GDPR and the UK GDPR. The UK Addendum can be used by companies that want to avoid having different mechanisms in place, as this document can be used for both types of data flows. Organisations that want to fold the UK Addendum provisions into wider group data transfer agreements will have to amend the document; however, only minor changes will be required, as the table format can be altered and the UK-specific signatures within the document are optional.
If organisations only require the safeguarding of UK data, the document will not need any amendments and can be used as it is.
However, as the new EU SCCs come into force for existing arrangements in December 2022, organisations that want to use the UK Addendum as a transfer tool will have to implement it prior to the December 2022 deadline.
The IDTA may be used even if the importer is subject to the UK GDPR; in that case, the sections containing UK GDPR obligations can be removed, as those obligations will already apply directly to the importer.
It is worth noting that, unlike the new EU SCCs, the IDTA does not include requirements under Article 28 of the UK GDPR, hence a separate data processing agreement will need to be signed.
Transfer Impact Assessments (‘TIA’)
A TIA must be carried out regardless of the transfer tool chosen by the organisation. A TIA must be carried out before the transfer is approved and completed. This TIA is very similar to that required by the EU SCCs.
The ICO has updated its guidance and uploaded the addendums to its website.
Data Reform Bill announced in 2022 Queen's Speech
Following its departure from the EU, the UK retained the GDPR in domestic law as the UK GDPR. However, in 2020 the Prime Minister announced his intention to deviate from the EU framework.
During this year’s Queen’s Speech, the formal opening of Parliament, in which the monarch sets out the government’s legislative plans for the upcoming year, the intention to initiate a reform of the current UK data protection framework was announced.
The 'Queen’s Speech 2022: background briefing notes' ('the Briefing Notes') outlines that the purpose of the Bill is to:
- Take advantage of the benefits of Brexit to create a world class data rights regime that will allow for the creation of a new pro-growth and trusted UK data protection framework to promote innovation and to improve the lives of people in the UK;
- Modernise the Information Commissioner's Office ('ICO'), making sure it has the capabilities and powers to take stronger action against organisations who breach data rules while requiring it to be more accountable to Parliament and the public; and
- Increase industry participation in Smart Data Schemes, which will give citizens and small businesses more control of their data, and help those who need healthcare treatments, by helping improve appropriate access to data in health and social care contexts.
Benefits of the Bill
The Briefing Notes also outlined the potential advantages of the Bill:
- Increasing the competitiveness and efficiencies of UK businesses by reducing the burdens they face;
- Making sure that data can be used to empower citizens and improve their lives, via more effective delivery of public healthcare, security, and government services;
- Creating a clearer regulatory environment for personal data use that will fuel responsible innovation and drive scientific progress;
- Ensuring that the regulator takes appropriate action against organisations who breach data rights and that citizens have greater clarity on their rights; and
- Simplifying the rules around research to cement the UK's position as a science and technology superpower.
Finally, the Briefing Notes outline that the Bill will mainly have UK-wide territorial extent and application, with some measures extending and applying to England and Wales only.
The UK Adequacy Decision
The EU Adequacy Decision will expire in 2024. Before its renewal, an assessment will be completed on whether or not the UK has kept comparable standards in place. The introduction of the data protection reform could have a direct effect on the renewal of the Adequacy Decision. However, details of what the reform entails are yet to be released.
Consultation on the reform
On 17 June 2022, the Government response to the consultation on proposals to reform the UK data regime, titled Data: A New Direction, was published. In particular, the response reiterates the Government's intention to establish the UK as a global data marketplace, building upon the National Data Strategy.
Get in touch
If you have any queries on the topics covered in this issue of Data Privacy Matters, please contact Tom Hyland of our Risk Consulting practice. We'd be delighted to hear from you.