EU: CJEU Insight https://privacymatters.dlapiper.com/2024/10/eu-cjeu-insight/ Tue, 15 Oct 2024 14:31:59 +0000

October has already been a busy month for the Court of Justice of the European Union ("CJEU"), which has published a number of judgments on the interpretation and application of the GDPR, including five important decisions, all issued on one day – 4 October 2024.

This article provides an overview and summary of several of the key data protection judgments issued by the CJEU this month. The judgments consider issues including: whether legitimate interests can cover purely commercial interests; whether competitors are entitled to bring an injunction claim based on an infringement of the GDPR; what constitutes 'health data' within the meaning of Art. 4 and Art. 9 GDPR; whether a controller can rely on an opinion of the national supervisory authority to be exempt from liability under Art. 82(2) GDPR; and what constitutes sufficient compensation for non-material damage.

Following preliminary questions from the Amsterdam district court in a case concerning the Royal Dutch Lawn Tennis Association (KNLTB) (C-621/22), the CJEU has provided valuable clarification as to whether "legitimate interests" under Art. 6(1)(f) GDPR can be "purely commercial". In its judgment, the CJEU recognised that a wide range of interests can be considered a 'legitimate interest' under the GDPR and that there is no requirement for the interests of the controller to be laid down by law. While the CJEU decided not to answer the specific preliminary questions received from the Amsterdam district court, its position is clear: "legitimate interests" can serve purely commercial interests.

For further information on this decision, please see our blog post available here.  

In its judgment in Lindenapotheke (C-21/23), the CJEU ruled that Chapter VIII of the GDPR allows for national rules which grant undertakings the right to take action, on the basis of the prohibition of acts of unfair competition, against an infringement of substantive provisions of the GDPR allegedly committed by a competitor. The CJEU further ruled that the data of a pharmacist's customers, provided when ordering pharmacy-only but non-prescription medicines on an online sales platform, constitute "health data" within the meaning of Art. 4(15) and Art. 9 GDPR (to that extent contrary to the Advocate General's opinion of 25 April 2024).

For further information on this decision, please see our blog post available here.  

  • Maximilian Schrems v Meta Platforms Ireland Ltd (C-446/21) 

Background 

The privacy activist, Maximilian Schrems, brought an action before the Austrian courts challenging the processing of his personal data by Meta Platforms Ireland (“Meta”) in the context of the online social network Facebook. Mr Schrems argued that personal data relating to his sexuality had been processed unlawfully by Meta to send him personalised advertisements.   

Mr Schrems alleged that this processing took place without his consent or another lawful basis under the GDPR. The CJEU noted that Mr Schrems had not posted sensitive data on his Facebook profile and had not consented to Meta using a wider pool of personal data received from advertisers and other partners, concerning his activities outside Facebook, for the purpose of providing personalised advertising.

The personalised advertisements in question were not based directly on his sexual orientation but on an analysis of his particular interests, drawn from a wider pool of data processed by Meta, as nothing had been openly published by Mr Schrems via Facebook about his sexuality. 

Key findings 

In its judgment, the CJEU held that Art. 5(1)(c) GDPR does not allow a controller, in particular a social network platform, to process data collected inside and outside the platform for the purpose of personalised advertising for an unlimited time and without distinction as to the type of data.

The CJEU emphasised that the principle of data minimisation requires the controller to limit the retention period of personal data to what is strictly necessary in the light of the objective of the processing activity. 

Regarding the collection, aggregation and processing of personal data for the purposes of targeted advertising, without distinction as to the type of those data, the CJEU held that a controller may not collect personal data in a generalised and indiscriminate manner and must refrain from collecting data which are not strictly necessary for the processing purpose. 

The CJEU also held that the fact that an individual manifestly made public information concerning their sexual orientation does not mean that the individual consented to processing of other data relating to their sexual orientation by the operator of an online social network platform within the meaning of Art. 9(2)(a) GDPR. 

  • Agentsia po vpisvaniyata (C-200/23)

Background

The data subject is a shareholder of a company in Bulgaria. The company’s constitutive instrument was sent to the Registration Agency (Agentsia po vpisvaniyata), the Bulgarian authority managing the commercial register. 

This instrument, which includes the surname, first name, identification number, identity card number, date and place of issue of that card, as well as the data subject's address and signature, was made available to the public by the Agency as submitted. The data subject requested that the Agency erase the personal data relating to her contained in the constitutive instrument. As it is a legal requirement under Directive 2017/1132 (relating to certain aspects of company law) to publish certain information from a company's constitutive instrument in the commercial register, the Agency refused the request. The Agency also refused to delete the personal data that was not required under the Directive but had nevertheless been published because it was contained in the instrument. The data subject brought an action before the Administrative Court of Dobrich (Administrativen sad Dobrich) seeking annulment of the Agency's decision and an order that the Agency compensate her for the alleged non-material damage she suffered.

 Key findings 

Of the eight questions asked by the national court, the CJEU answered six, five of which related directly to the GDPR. Firstly, the CJEU held that an operator of a public register, which receives personal data as part of a constitutive instrument subject to compulsory disclosure under EU law, is both a 'recipient' of the personal data insofar as it makes the data available to the public, and a 'controller', even if the instrument contains personal data that the operator is not required to process under EU or Member State law. This remains the case even where the operator receives additional personal data because the data subject failed to redact it from the constitutive instrument, as the operator's procedural rules required.

Secondly, the controller managing the national register may not refuse outright any request for erasure of personal data published in the register on the basis that the data subject should have provided a redacted copy of the constitutive instrument. A data subject enjoys a right to object to processing and a right to erasure, unless there are overriding legitimate grounds (which was not the case here).

Thirdly, the CJEU confirmed that a handwritten signature of a natural person is considered personal data as it is usually used to identify a person and has evidential value regarding the accuracy and sincerity of a document.  

Fourthly, the CJEU held that Art. 82(1) GDPR must be interpreted as meaning that a loss of control by the data subject over their personal data for a limited period, due to the making available of such data to the public online in the commercial register of a Member State, may be sufficient to cause 'non-material damage'. What is required in any case is that the person demonstrates that they actually suffered such damage, however minimal. The concept of 'non-material damage' does not require the demonstration of additional tangible adverse consequences.

Lastly, where the supervisory authority of a Member State issues an opinion on the basis of Art. 58(3)(b) GDPR, a controller which acts in line with that opinion is not thereby exempt from liability under Art. 82(2) GDPR. The Agency had argued, relying on an opinion of the Bulgarian supervisory authority, that a company's constitutive instrument may still be entered into the register even if personal data is not redacted. However, as such an opinion issued to the controller is not legally binding, it cannot demonstrate that the damage suffered by the data subject is not attributable to the controller, and it is therefore insufficient to exempt the controller from liability.

  • Patērētāju tiesību aizsardzības centrs (Latvian Consumer Rights Protection Centre) (C-507/23)

Background 

The data subject is a well-known journalist and expert in the automotive sector in Latvia. During a campaign to make consumers aware of the risks involved in purchasing a second-hand vehicle, the Latvian Consumer Rights Protection Centre (“PTAC”) published a video on several websites which, among other things, featured a character imitating the data subject, without his consent.  

The journalist brought an action before the District Administrative Court in Latvia seeking (i) a finding that the actions of the PTAC, consisting in the use and distribution of his personal data without authorisation, were unlawful, and (ii) compensation for non-material damage in the form of an apology and the payment of EUR 2,000. The court ruled that the actions in question were unlawful, and ordered the PTAC to bring those acts to an end, to make a public apology to the journalist and to pay him EUR 100 in compensation for the non-material damage he had suffered. On appeal, the Regional Administrative Court confirmed that the processing of personal data by the PTAC was unlawful and ordered the processing to cease and an apology to be published on the websites which had disseminated the video footage, but it dismissed the claim for financial compensation for the non-material damage suffered. The court found that the infringement committed was not serious, on the ground that the video footage was intended to perform a task in the public interest and not to harm the data subject's reputation, honour and dignity.

The journalist appealed this decision, and the Latvian Supreme Court referred a number of questions on the interpretation of Art. 82(1) GDPR to the CJEU.

 Key findings 

Firstly, the CJEU found that an infringement of a provision of the GDPR, including the unlawful processing of personal data, is not sufficient, in itself, to constitute ‘damage’ within the meaning of Art. 82(1) GDPR.  

By this, the CJEU repeats and emphasises its previous interpretations of Art. 82(1) GDPR to the effect that a mere infringement of the GDPR is not sufficient to confer a right to compensation: in addition to an 'infringement', the existence of 'damage' and of a 'causal link' between the damage and the infringement are cumulative conditions for the right to compensation under Art. 82(1) GDPR. According to the CJEU, this principle applies even if the provision of the GDPR that has been infringed grants rights to natural persons, as such an infringement cannot, in itself, constitute 'non-material damage'. In particular, the CJEU held that the occurrence of damage in the context of the unlawful processing of personal data is only a potential, and not an automatic, consequence of such processing.

Secondly, the CJEU found that the presentation of an apology may constitute sufficient compensation for non-material damage on the basis of Art. 82(1) GDPR. This applies in particular where it is impossible to restore the situation that existed prior to the occurrence of the damage, provided that that form of redress is capable of fully compensating for the damage suffered by the data subject.

According to the CJEU, Art. 82(1) GDPR does not preclude an apology from constituting standalone or supplementary compensation for non-material damage, provided that such a form of compensation complies with the principles of equivalence and effectiveness. In the present case, an apology as a possible form of compensation was explicitly provided for in Art. 14 of the Latvian Law on compensation for damage caused by public authorities. Other jurisdictions, however, such as German civil law, do not explicitly provide for an apology as a form of compensation; nevertheless, some courts have already taken apologies into account when determining the amount of monetary compensation. In light of this decision, courts may increasingly consider an apology as a means of reducing the monetary amount of compensation for damages.

Thirdly, according to the CJEU, Art. 82(1) GDPR precludes the controller’s attitude and motivation from being taken into account when deciding whether to grant the data subject less compensation than the damage actually suffered.  

According to the CJEU, Art. 82(1) GDPR has an exclusively compensatory and not a punitive function. Therefore, the gravity of an infringement cannot influence the amount of damages awarded under Art. 82(1) GDPR. The amount of damages may not be set at a level that exceeds full compensation for the actually suffered damage. 

Conclusion/implications 

While these five judgments were published on the same day, they relate to a number of different topics. What they have in common is that they all demonstrate the CJEU's willingness to assert its reach and tackle difficult questions on the interpretation of the GDPR, particularly where there has not always been agreement or clarity among supervisory authorities. Although these decisions generally clarify and strengthen the CJEU's previous interpretation of a number of issues, such as those relating to compensation for non-material damage pursuant to Art. 82(1) GDPR, it is interesting that in both the KNLTB decision and the Agentsia po vpisvaniyata decision, the CJEU followed a different interpretation of the GDPR to that of the relevant supervisory authorities (and, in the KNLTB decision, contrary to the AG Opinion).

As we head into 2025, we can expect further judgments from the CJEU on the interpretation and application of the GDPR, with more than 20 GDPR-related cases currently pending before the Court.

Ireland: Increased regulatory convergence of AI and data protection: X suspends training of AI chatbot with EU user data after Irish regulator issues High Court proceedings https://privacymatters.dlapiper.com/2024/08/ireland-increased-regulatory-convergence-of-ai-and-data-protection-x-suspends-training-of-ai-chatbot-with-eu-user-data-after-irish-regulator-issues-high-court-proceedings/ Mon, 19 Aug 2024 12:23:43 +0000

The Irish Data Protection Commission (DPC) has welcomed X's agreement to suspend its processing of certain personal data for the purpose of training its AI chatbot tool, Grok. This comes after the DPC issued suspension proceedings against X in the Irish High Court. The DPC described this as the first time that any Lead Supervisory Authority had taken such an action, and the first time that it had utilised these particular powers.

Section 134 of the Data Protection Act 2018 allows the DPC, where it considers there is an urgent need to act to protect the rights and freedoms of data subjects, to make an application to the High Court for an order requiring a data controller to suspend, restrict, or prohibit the processing of personal data.

The High Court proceedings were issued on foot of a complaint to the DPC raised by the consumer rights organisations Euroconsumers and Altroconsumo on behalf of data subjects in the EU/EEA. The complainants argued that the Grok chatbot was being trained with user data in a manner that did not sufficiently explain the purposes of data processing, and that more data than necessary was being collected. They further argued that X may have been handling sensitive data without sufficient reasons for doing so.

Much of the complaint stemmed from X’s initial approach of having data sharing automatically turned on for users in the EU/EEA, which it later mitigated by adding an opt-out setting. X claimed that it had relied on the lawful basis of legitimate interest under the GDPR, but the complainants argued that X’s privacy policy – dating back to September 2023 – was insufficiently clear as to how this applied to the processing of user data for the purposes of training AI models such as Grok.

This development follows a similar chain of events involving Meta in June. Complaints from privacy advocacy organisation NOYB were made against Meta’s reliance on ‘legitimate interest’ in relation to the use of data to train AI models. This led to engagement with the DPC and the eventual decision in June by Meta to pause relevant processing (without the need for the authority to invoke s134).

The DPC and other European supervisory authorities strive to emphasise the principles of lawfulness, fairness and transparency at the heart of the GDPR, and their actions illustrate that any activities perceived to threaten these values will be dealt with directly.

The DPC has previously taken the approach of making informal requests and has stated that the exercise of its powers in this case comes after extensive engagement with X on its model training. The High Court proceedings highlight the DPC’s willingness to escalate action where there remains a perceived risk to data subjects.

The DPC has, in parallel, stated that it intends to refer the matter to the EDPB although there has been no confirmation of such referral as of this date.

Such a referral will presumably form part of a thematic examination of AI processing by data controllers. The topic is also the subject of debate among individual DPAs, as evidenced by the Discussion Paper on Large Language Models and Personal Data recently published by the Hamburg DPA.

The fact that much of the high-profile activity relating to the regulation of AI is coming from the data protection sphere will no doubt bolster the EDPB's recommendation, made in a statement last month, that Data Protection Authorities (DPAs) are best placed to regulate high-risk AI.

Regulatory scrutiny and activity is expected to escalate and accelerate in tandem with the increasing integration of powerful AI models into existing services by 'big tech' players to enrich data. This is particularly the case where data sets are perceived to be re-purposed and further processing is taking place. In such circumstances, it is essential that an appropriate legal basis is relied upon, noting the significant issues that can arise from an over-reliance on legitimate interest. The DPC and other regulators are likely to investigate, engage and ultimately intervene where they believe that data subjects' rights under the GDPR are threatened. Perhaps in anticipation of more cross-border enforcement activity, the European Commission last month proposed a new law to streamline cooperation between DPAs when enforcing the GDPR in such cases.

A fundamental lesson from these developments is that, in the new AI paradigm, ensuring that there is a suitable legal basis for any type of processing, and that the principles of fairness and transparency are complied with, should be an absolute priority.

Ireland: DPC Issues Record 87% of EU GDPR Fines in 2023; Breach Reports Increase by 20% https://privacymatters.dlapiper.com/2024/06/ireland-dpc-issues-record-87-of-eu-gdpr-fines-in-2023-breach-reports-increase-by-20/ Thu, 06 Jun 2024 12:23:06 +0000

The Data Protection Commission (DPC) has published its 2023 Annual Report, highlighting a record year in which DPC fines accounted for 87% of all GDPR fines issued across the EU. A busy year for the DPC also saw a 20% increase in reported personal data breaches, as Helen Dixon stepped down after 10 years in the role and Dr. Des Hogan and Dale Sunderland took over the reins.

The past year saw the DPC progress ongoing large-scale inquiries, in particular against social media platforms, defend cross-border decisions in legal proceedings brought by regulated entities on appeal, and increase its interaction with the European Data Protection Board (EDPB). As a result, DPC fines accounted for 87% of the GDPR fines issued by EU data protection authorities last year.

The DPC received a total of 6,991 valid notifications of personal data breaches in 2023, an increase of 20% against the previous year. The DPC also handled 43 complaints relating to alleged personal data breaches which were not notified to the DPC in line with Article 33.

Unauthorised disclosure of personal data continues to be the leading reason for breach notifications, accounting for 52% of the overall total in 2023. 146 of the valid data breach notifications were received under the ePrivacy Regulations, an increase of 42%, and 59 notifications were received in relation to the Law Enforcement Directive. In line with previous years, most reported incidents originated from the private sector (3,766), followed by the public sector (2,968), with the remainder coming from the voluntary and charity sector (275).

Complaints Handling

The Annual Report notes another year of extensive enforcement work by the DPC. In total, 11,147 cases were concluded by the DPC in 2023. As of 31 December 2023, the DPC had 89 statutory inquiries on-hand, including 51 cross-border inquiries. In addition to its cases and inquiries, the DPC also handled over 25,130 electronic contacts, 7,085 phone calls and 1,253 postal contacts. 

The Annual Report highlights that once again the most frequent GDPR topics for queries and complaints in 2023 were access requests; fair-processing; disclosure; direct marketing and right to erasure (delisting and/or removal requests).

Administrative Fines and Large-Scale Inquiries

The Annual Report highlights 19 inquiries concluded in 2023, resulting in fines totalling €1.55 billion. The tables below show a consistent enforcement strategy being implemented by the DPC, focusing on international and domestic companies' compliance with core principles of the GDPR (e.g. transparency, lawful basis, security measures) as well as on targeted thematic areas (e.g. children's personal data and video surveillance).

Since the implementation of the GDPR, the DPC has been established as the Lead Supervisory Authority for 87% of cross-border complaints.

Notable large scale cross border inquiries that concluded in 2023 were:

Controller Sector | Fine | Issues At Play
Social Media | €5.5 million | Controller was not entitled to rely on contract as a lawful basis for service improvement and security under its terms and conditions.
Social Media | €1.6 billion | Transfer of data from the EU to the US without a lawful basis.
Social Media | €345 million | Processing of children's personal data.

Notable domestic inquiries that concluded in 2023 were:

Controller Sector | Fine | Issues At Play
Financial Services | €750,000 | Ten data breaches relating to the unauthorised disclosure of personal data on a customer facing app.
Healthcare | €460,000 | A ransomware attack which impacted over 70,000 patients and their data, with 2,500 permanently affected when data was deleted with no back-up.
County Council | €50,000 | Usage of CCTV, car plate reading technology and body worn cameras.

Ongoing Inquiries

The report shows that the breadth and scale of the inquiries being undertaken by the DPC show no signs of abating. Notable inquiries that have been progressed by the DPC include:

Controller Sector | Status | Issues At Play
Government Department | DPC is preparing a Statement of Issues | Allegation that the database used for the Public Services Card was unlawfully provided to the Department.
Technology | Draft Decision with peer regulators for review (Art 60 GDPR) | Processing of location data.
Technology | Draft Decision with peer regulators for review (Art 60 GDPR) | Compliance with transparency obligations when responding to data subjects.
Social Media | DPC has issued preliminary draft decisions in relation to four related inquiries | User generated data being posted on Social Media.
Social Media | Draft Decision with peer regulators for review (Art 60 GDPR) | Transfer of data from the EU to China.
Technology | Draft Decision with peer regulators for review (Art 60 GDPR) | Real time bidding / adtech and data subject access rights.
Social Media | DPC is preparing its preliminary draft decision | Allegation of collated datasets being made available online.

Litigation  

At the outset of its Annual Report, the DPC recognizes the continued focus on domestic litigation before the Irish Courts. The DPC was awarded a considerable number of legal costs orders in 2023. The threat of a legal cost order may act as a deterrent to those considering challenging the DPC in the future.

There were 7 national judgments or final orders in 2023 split almost evenly between the Irish Circuit Court and the Irish High Court. The cases involved: 1 plenary matter, 5 appeals (with 4 statutory appeals and 1 appeal on a point of law) and 1 judicial review. 2 cases issued against the DPC were discontinued and a further 5 were concluded. The legal costs of 5 proceedings were awarded in favour of the DPC, with no reference to costs made in the reports for the other 2 proceedings. These awards enable the DPC to seek the legal costs it incurred in defending the proceedings against the claimant(s).

The DPC uses the Annual Report to showcase its supervisory and enforcement functions in relation to the processing of personal data in the context of electronic communications under the e-Privacy Regulations. The Annual Report highlights 4 successful prosecutions involving unsolicited marketing messages. In all 4 cases, the DPC had the legal costs of the prosecution discharged by the defendants, two of whom were companies in the telecommunications and insurance sectors.  

Children  

Prioritising the protection of children and other vulnerable groups forms one of the five core pillars of the DPC's Regulatory Strategy 2022 – 2027, so it was no surprise that the DPC continued to be proactive in safeguarding children's data protection rights this year. This is reflected in the list of matters prioritised for direct intervention by the DPC during 2023, which included CCTV in school toilets and the posting of images of children online. The DPC issued a Final Decision and imposed a large fine of €345 million against a major social media company for infringements of the GDPR related to the processing of personal data relating to children.

The DPC also produced guidance for organisations and civil society to enhance the protection of children's personal data. An example of this is the data protection toolkit for schools, which was devised by the DPC after it noticed, in the course of supervisory and engagement activities, that the sector was finding certain aspects of data protection compliance challenging.

Interestingly, the DPC has been nominated to represent the EDPB on the newly formed Task Force on Age Verification under the Digital Services Act and act as co-rapporteur in the preparation at EDPB level of guidance on children’s data protection issues. This leadership role follows the DPC’s publication of a guidance note on the Fundamentals of children’s data protection and the DPC’s enforcement activity in this area over recent years.

Data Protection Officers  

The DPC has continued its efforts to bring together the DPO community in Ireland, recognising the importance of the DPO’s role in data protection compliance for organisations. As at the end of 2023, the DPC has been notified of 3,520 DPOs. The DPC is actively engaging with DPO networks across a number of key sectors and has contributed to several events aimed at DPOs including a new course run by the Institute of Public Administration, ‘GDPR and Data Protection Programme for DPOs in the Public Service’.

Importantly, the DPC participated in the 2023 Coordinated Enforcement Framework (CEF) Topic ‘The Designation and Position of Data Protection Officers’. The DPC contacted 100 DPOs and identified three substantive issues in its national report:

  • Resources available to DPOs – a third of respondents noted they do not have sufficient resources to fulfill their role;
  • Conflicts of interests – over a third indicated their role is split with other core governance roles within their organisations; and
  • Tasks of the DPO – it was noted that many tasks of the DPO do not actually complement the role of the DPO within many organisations.

Supervision  

A sectoral breakdown notes that of the 751 supervision engagements during 2023, 391 were from multinational technology companies. The DPC also provided guidance and observations on 37 proposed legislative measures.

Supervisory engagements undertaken by the DPC in 2023 included identifying data protection issues arising in the context of adult safeguarding and service provision to at-risk adults, and an examination of the use of technology in sport and the processing of health data for performance monitoring (a questionnaire is due to issue to voluntary and professional sports).

The DPC also engaged with the Local Government Management Authority in relation to three draft codes of practice prepared in relation to the use of CCTV and mobile recording devices to investigate and prosecute certain waste and litter pollution related offences. Separately, given the significant increase in the use of CCTV in areas where there is an increased expectation of privacy, the DPC published a detailed update of its CCTV Guidance in November 2023.

In February 2024, Helen Dixon stepped down from her role as Data Protection Commissioner and Dr. Des Hogan, who serves as Chairperson, and Mr. Dale Sunderland commenced their new roles.

The DPC continues to focus on systemic non-compliance and children’s data protection rights in 2024 as well as participating in the EDPB’s ongoing coordinated enforcement action on the right of access. With the level of enforcement action taking place as well as the rapid pace of AI and technology development, organisations are advised to review and update their privacy frameworks to ensure compliance with the GDPR. 

CJEU ruling clarifies data protection and e-privacy issues in the ad-tech space https://privacymatters.dlapiper.com/2024/03/cjeu-ruling-clarifies-data-protection-and-e-privacy-issues-in-the-ad-tech-space/ Wed, 13 Mar 2024 10:43:25 +0000

Introduction

Identifiability; what can amount to personal data; and joint controllership are some of the issues addressed by the Court of Justice of the European Union (CJEU) in its recent judgment in the IAB Europe case (C-604/22). This case concerned the use of personal data for online advertising purposes and the use of real time bidding technology.

The CJEU's judgment, delivered on 7 March 2024, is the result of IAB Europe's appeal of a decision of the Belgian Data Protection Authority (Belgian DPA) regarding the Transparency and Consent Framework (TCF) and IAB Europe's role within it.

Background

IAB Europe is a non-profit association representing undertakings in the digital marketing and advertising sector at European level. It developed the TCF, which is an operational framework of rules intended to enable online publishers, data brokers and advertisers to obtain users’ consent and lawfully process their personal data.

The TCF is widely applied in the context of a real time auctioning system used to acquire advertising space for the display of targeted advertisements online. A key component of the TCF is the Transparency and Consent String (TC String).

The TC String is a combination of letters and characters which encodes and records user preferences through consent management platforms (CMPs), when they visit a website or app. The TC String is then shared with ad platforms and other participants of the ad-tech ecosystem; the CMP also places a specific cookie on the user device. When combined, the TC String and this cookie can be linked to the user’s IP address.
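
As a deliberately simplified illustration of how a consent string of this kind can pack per-purpose choices into a compact, URL-safe token, the Python sketch below encodes a set of consented purposes as a bit field and decodes it again. The purpose count, bit layout and field set are invented for this example and do not reproduce the actual TC String format defined in IAB Europe's TCF technical specification, which also carries a version, timestamps, CMP and vendor-list identifiers and vendor consent sections.

```python
import base64

NUM_PURPOSES = 10  # illustrative only; the real TCF defines its own purpose list


def encode_consent(consented_purposes: set) -> str:
    """Pack per-purpose consent flags (1-indexed) into a URL-safe token.

    Simplified stand-in for a TC String: one bit per purpose and nothing
    else, unlike the real specification.
    """
    bits = 0
    for purpose in consented_purposes:
        if not 1 <= purpose <= NUM_PURPOSES:
            raise ValueError(f"unknown purpose id: {purpose}")
        bits |= 1 << (NUM_PURPOSES - purpose)  # most significant bit = purpose 1
    raw = bits.to_bytes((NUM_PURPOSES + 7) // 8, "big")
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode("ascii")


def decode_consent(token: str) -> set:
    """Recover the set of consented purpose ids from the encoded token."""
    padded = token + "=" * (-len(token) % 4)
    bits = int.from_bytes(base64.urlsafe_b64decode(padded), "big")
    return {p for p in range(1, NUM_PURPOSES + 1) if bits & (1 << (NUM_PURPOSES - p))}


if __name__ == "__main__":
    token = encode_consent({1, 3, 4})
    print(token)                  # compact string shared with downstream ad-tech actors
    print(decode_consent(token))  # {1, 3, 4}
```

Even in this reduced form, the sketch shows why such a string is, on its face, only a combination of letters and characters, yet becomes personal data once it can be linked to an identifier such as a cookie or IP address.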

On 2 February 2022, the Belgian DPA held that the TC String amounts to personal data, that IAB Europe qualifies as a data controller under the GDPR, and that IAB Europe is as a result in non-compliance with certain requirements of the GDPR (for details see our blogpost at Belgian DPA decision on IAB Transparency and Consent Framework | Privacy Matters (dlapiper.com)).

IAB Europe contested the Belgian DPA decision, and the Brussels Court of Appeal referred two questions to the CJEU for a preliminary ruling:

  1. Whether a character string capturing user preferences in connection with the processing of their personal data constitutes personal data.
  2. Whether an organisation which proposes to its members a framework relating to the consent to the processing of personal data containing rules setting out how such personal data is to be stored or disseminated must be classified as a controller within the meaning of the GDPR.

The ruling

First question

Drawing from its previous rulings, the CJEU stated that the concept of personal data under Article 4(1) of the GDPR includes information resulting from the processing of personal data relating to an identified or identifiable person. It noted that a string such as the TC String contains the individual preferences of a specific user in relation to the processing of their personal data.

The CJEU concluded that, if the combination of a TC String with additional data, such as the user’s IP address, allows the user to be identified, then the TC String contains information concerning an identifiable user and constitutes personal data within the meaning of Article 4(1) of the GDPR.

The fact that IAB Europe cannot itself combine the TC String with the user's IP address and does not have direct access to the data processed by its members does not change that conclusion.

The CJEU took the view that, subject to the verifications that are for the Brussels Court of Appeal to carry out, IAB Europe has, under the TCF, reasonable means of identifying an individual from a TC String, since it can request its members to provide it with all the information allowing the users whose data are the subject of a TC String to be identified.

It follows from this that a TC String can constitute personal data within the meaning of Article 4(1) of the GDPR.

Second question

To address the second question, the CJEU built upon its previous judgments and stated that a natural or legal person exerting influence over the processing of personal data and, as a result, participating in the determination of the purposes and means of that processing may be regarded as a controller within the meaning of Article 4(7) of the GDPR.

The CJEU confirmed again that the concept of joint controllership does not necessarily imply equal responsibility and does not require each joint controller to have access to the personal data concerned.

The CJEU took the view that IAB Europe, as a sectoral organisation which makes a standard available to its members, appears to exert influence over the personal data processing operations carried out when consent preferences are recorded in a TC String, and jointly determines, with its members, the purposes and means of those operations.

It follows that IAB Europe can, in certain instances, be regarded as a controller within the meaning of Article 4(7) of the GDPR.

The court clarified this point further, adding that a distinction must be drawn between the processing of personal data carried out by the members of IAB Europe when the consent preferences of the users concerned are recorded in a TC String in accordance with the framework of rules established in the TCF, and the subsequent processing of personal data by operators and third parties on the basis of those preferences. Accordingly, the court was of the view that IAB Europe cannot automatically be regarded as a controller in respect of subsequent data processing operations carried out by third parties on the basis of the preferences contained in the TC String, such as digital advertising or content personalisation, if IAB Europe does not exert an influence over the determination of either the purposes or the means of that processing.

Conclusion / implications

While not necessarily seismic or revelatory, the CJEU decision does bring welcome clarity on some longstanding data protection and e-privacy issues in the ad-tech space, in particular on the question of identifiability of individuals, the breadth of what can amount to personal data and the reach of joint controllership.

IAB Europe has welcomed the decision that “provides well-needed clarity over the concepts of personal data and (joint) controllership, which will allow a serene completion of the remaining legal proceedings“.

The next step is for the matter to be assessed by the Brussels Court of Appeal, which will issue a final determination. Until then, the Belgian DPA's decision remains suspended.

Despite all the prophecies of doom, we believe that the TCF will emerge stronger from this decision. This is because neither the questions submitted to the court nor the CJEU’s answers call the TCF into question. On the contrary, IAB Europe should be able to resolve the issue of joint controllership for the participants in the TCF at a technical level, especially since, according to the CJEU, joint controllership cannot automatically be assumed for subsequent processing operations on the basis of the preferences articulated via the TC String. Organisations should assess whether and how they are using the TCF and continue to keep developments in this judgment under review.

CJEU Insight https://privacymatters.dlapiper.com/2024/01/cjeu-insight/ Wed, 24 Jan 2024 11:18:40 +0000

2023 was a busy year for the Court of Justice of the European Union (CJEU), with the issuance of a number of far-reaching judgments on the interpretation and application of the GDPR.

In December 2023, the CJEU delivered two important decisions which supplement a growing body of jurisprudence on the issuance of administrative fines and claims for non-material damages.  

In Deutsche Wohnen (C-807/21), the CJEU delivered clear guidance on the need to establish wrongdoing by a controller in order to impose a fine, while in Natsionalna agentsia za prihodite (C-340/21), the CJEU weighed in on the adequacy of a controller's security measures and controllers' resulting exposure to claims for damages.

Deutsche Wohnen

Background

On 5 December 2023, the CJEU delivered a judgment on the culpability of data controllers and the administration of fines by a supervisory authority for infringing the GDPR.

In this case, Deutsche Wohnen, a German listed real estate company, was fined approximately €14.5 million by the Berlin Data Protection Authority for the "intentional infringement" of the GDPR. The primary issue was Deutsche Wohnen's failure to delete personal data belonging to tenants when it was no longer necessary.

Deutsche Wohnen brought an action against that decision, which led to two fundamental questions being referred to the CJEU:

  1. To address a complex conflict between German law and the GDPR on the liability of undertakings, the CJEU was asked whether an administrative fine can be issued under Article 83 GDPR against an undertaking without the infringement first being attributed to an identified natural person (e.g. a member of the undertaking's bodies or a representative of the undertaking concerned).
  2. The CJEU was asked whether an undertaking must have intentionally or negligently committed an infringement of the GDPR, or whether the objective fact of the infringement suffices for a fine to be imposed (i.e. whether the undertaking is strictly liable for the infringement).

Key findings

Perhaps not surprisingly, in answering the first question, the CJEU held that the GDPR does not permit Member States to make the imposition of an administrative fine on a legal person, as controller, subject to a prior finding that the infringement was committed by an identified natural person.

In answering the second question the CJEU has provided some clear and direct guidance:

  • A function of administrative fines is to incentivise compliance with the GDPR; however, fulfilling that function does not require fines to be capable of being imposed in the absence of any wrongdoing.
  • Only infringements committed wrongfully (intentionally or negligently) can result in culpability and lead to a fine being imposed.
  • Nothing in the GDPR allows for Member States to deviate from this requirement and to effectively establish a strict liability regime.
  • Ignorance of an infringement is no defence.
  • It is not necessary to establish that a member of management acted intentionally, negligently, or was even aware of the infringement.
  • The concept of an undertaking is derived from EU competition law; when a supervisory authority calculates a fine, it must do so on the basis of a percentage of the total worldwide annual turnover of the undertaking (group) in the preceding business year (a simple illustration of the resulting ceiling follows this list).
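
By way of illustration of that turnover-based approach, the minimal sketch below computes the fine ceilings set by Art. 83(4) and 83(5) GDPR (up to €10 million or 2%, and up to €20 million or 4%, of the undertaking's total worldwide annual turnover for the preceding financial year, whichever is higher). The €2 billion turnover figure is hypothetical, and the actual fine within the ceiling remains a matter for the supervisory authority applying the criteria in Art. 83(2) GDPR.

```python
def gdpr_fine_ceiling(group_turnover_eur: float, severe: bool = True) -> float:
    """Maximum administrative fine ceiling under Art. 83 GDPR.

    Art. 83(5): up to EUR 20m or 4% of the undertaking's (group's) total
    worldwide annual turnover for the preceding financial year, whichever
    is higher. Art. 83(4): up to EUR 10m or 2%, whichever is higher.
    """
    if severe:
        return max(20_000_000, 0.04 * group_turnover_eur)
    return max(10_000_000, 0.02 * group_turnover_eur)


# Hypothetical undertaking with EUR 2bn worldwide group turnover:
print(gdpr_fine_ceiling(2_000_000_000))         # 80000000.0 -> 4% ceiling applies
print(gdpr_fine_ceiling(2_000_000_000, False))  # 40000000.0 -> 2% ceiling applies
```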

Natsionalna agentsia za prihodite

Background

On 14 December 2023, the CJEU delivered an important judgment on the conditions necessary to award compensation for non-material damage suffered by data subjects following a cyberattack.

The Bulgarian National Revenue Agency (NAP) is an authority attached to the Bulgarian Minister for Finance. Its function is to identify, secure and recover public debts. On 15 July 2019, it was revealed that a cyberattack had taken place on the NAP’s IT system leading to the unlawful dissemination of personal data of more than six million individuals, including both Bulgarians and foreigners.

A case was brought by an affected data subject against the NAP before the Bulgarian Administrative Court, seeking an order for compensation under Article 82 GDPR for the non-material damage suffered as a result of the fear that the data subject’s personal data may be misused in the future.

The case was referred to the CJEU by the Bulgarian Supreme Administrative Court seeking clarification on whether a person’s fear that their data may be misused in the future following unauthorised access due to a cyberattack amounts to non-material damage under Article 82 GDPR.

Key findings

  • The CJEU confirmed that such fear can constitute non-material damage under the GDPR. However, a national court must satisfy itself that the fear is genuine and well founded, having regard to the specific circumstances of the infringement and of the data subject.
  • The following factors were persuasive:
    • Article 82(1) GDPR establishes the right to compensation from the controller for the (non-material) damages.
    • The right of compensation requires three cumulative conditions to be met: (i) damage which has been suffered; (ii) an infringement of the GDPR; and (iii) a causal link between the damage and the infringement (as set out in the Austrian Post decision).
    • Once an infringement has been established, Article 82 GDPR cannot be interpreted as distinguishing between a scenario where the non-material damage suffered stems from actual misuse of personal data compared to where the damage stems from the fear over potential future misuse. In other words, the concept of non-material damage encompasses both.

Conclusion / implications

The Deutsche Wohnen judgment is significant in that it develops the concept of culpability and wrongdoing and has thankfully provided long overdue clarity on whether Article 83 GDPR imposes a strict liability regime. The CJEU held that it does not.

The NAP judgment, meanwhile, means that controllers must take account not only of exposure to damages claims for tangible harm suffered due to a cyberattack, but also of the psychological distress that can be suffered through fear of the misuse of compromised personal data. This case reifies the expression "better safe than sorry" and elucidates the importance of having robust, state of the art technical and organisational measures in place. Controllers should consider both in tandem, as exposure for infringing the GDPR can take the form of both a fine imposed by a supervisory authority and an award of damages by a national court.

The two judgments, along with several other key CJEU decisions issued recently,[1] are a continuation of the CJEU extending its reach over controllers under the GDPR. The trickle-up effect from the decisions of supervisory authorities and national courts to the CJEU is starting to bear fruit, and over the course of 2024 we can expect a number of further important decisions from the CJEU on fundamental data protection issues.


[1] See for example, the Schufa case (C-634/21) and its impact on automated decision-making processes and the CJEU’s landmark decision in Meta vs Bundeskartellamt (C-252/21), where the CJEU imposed strict limitations on the use of the lawful bases of contractual necessity, legitimate interests and consent.
