CHINA: Recent Enforcement Trends

Recently, the Cyberspace Administration of China (CAC), the primary data regulator in China, published a newsletter about government authorities’ enforcement during 2024 against Apps and websites that violated personal data protection and cybersecurity laws.

Based on the official statistics, during 2024 the CAC interviewed 11,159 website platforms, imposed warnings or fines on 4,046 website platforms, ordered 585 websites to suspend or update relevant functions, took down 200 Apps and took administrative actions against 40 mini-programs. The CAC also conducted joint enforcement actions with the Ministry of Industry and Information Technology, revoking the licenses of, or shutting down, 10,946 websites and closing 107,802 accounts.

The following violations were a particular focus of these enforcement activities:

  • Failure to maintain relevant network logs as required by law or to promptly address security risks (such as system vulnerabilities), resulting in incidents such as system attacks, tampering and data leaks;
  • Failure to clearly display privacy notices in Apps, obtain necessary consent to process personal data, or provide convenient methods to opt out or de-register accounts;
  • Failure to conduct required recordal or filing for AI models or features built into Apps or mini-apps; and
  • Unreasonably requiring consumers to scan QR codes or perform facial recognition that is not necessary to provide the underlying services.

Around the same time, the National Computer Virus Emergency Response Center, an institution responsible for detecting and handling computer virus outbreaks and cyber attacks under the supervision of the Ministry of Public Security, published a list of Apps that violated personal data protection laws in the following areas:

  • Failure to provide data subjects with all the required information about the processing (e.g. name and contact details of the controller, categories of personal data processed, purposes of the processing, retention period, etc.) in a prominent place and in clear and understandable language; in particular, failure to provide such information about any third party SDK or plugin is also considered a breach of the law;
  • Failure to provide data subjects with the required details about any separate controller (e.g. name, contact information, categories of personal data processed, processing purposes, etc.) or to obtain the separate consent of data subjects before sharing their personal data with the separate controller;
  • Failure to obtain the separate consent of data subjects before processing their sensitive personal data;
  • Failure to provide users with the App functions to delete personal data or de-register accounts, or to complete the deletion or deregistration within 15 business days; or setting unreasonable conditions for users to de-register accounts;
  • Failure to formulate special rules for processing the personal data of minors (under the age of 14) or to obtain parental consent before processing the personal data of minors; and
  • Failure to take appropriate encryption, de-identification and other security measures, taking into account the nature of the processing and its impact on the rights and interests of data subjects (a minimal illustration of one such measure follows below).
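
By way of illustration only, the following sketch shows one common de-identification measure of the kind contemplated in the last point above: replacing a direct identifier with a keyed hash (pseudonymisation). The function name, the sample record and the environment variable are our own assumptions for the example, not anything prescribed by Chinese law or the CAC.

```typescript
import { createHmac } from "node:crypto";

// Pseudonymise a direct identifier with a keyed hash (HMAC-SHA256).
// The secret key must be stored separately from the data set: anyone
// holding the key can re-link the pseudonym, so this is de-identification,
// not anonymisation.
function pseudonymise(identifier: string, secretKey: string): string {
  return createHmac("sha256", secretKey).update(identifier).digest("hex");
}

// Hypothetical usage: tokenise a phone number before the record leaves
// the production system.
const secretKey = process.env.PSEUDONYM_KEY ?? "example-key"; // assumed env var
const record = { phone: "13800000000", lastOrderDate: "2024-11-02" };
const safeRecord = {
  phoneToken: pseudonymise(record.phone, secretKey),
  lastOrderDate: record.lastOrderDate,
};
```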

The above enforcement focuses are also consistent with the audit points highlighted in the newly released personal data protection audit rules (see our article here). We expect the same enforcement trend to continue into 2025. Companies that process personal data in China or in connection with business in China are advised to review their compliance status with the requirements of Chinese law and take remedial action in a timely manner.

EU: DLA Piper GDPR Fines and Data Breach Survey: January 2025

The seventh annual edition of DLA Piper’s GDPR Fines and Data Breach Survey has revealed another significant year in data privacy enforcement, with an aggregate total of EUR1.2 billion (USD1.26 billion/GBP996 million) in fines issued across Europe in 2024.

Ireland once again remains the preeminent enforcer, having issued EUR3.5 billion (USD3.7 billion/GBP2.91 billion) in fines since May 2018, more than four times the value of fines issued by the second-placed Luxembourg Data Protection Authority, which has issued EUR746.38 million (USD784 million/GBP619 million) over the same period.

The total fines reported since the application of the GDPR in 2018 now stand at EUR5.88 billion (USD6.17 billion/GBP4.88 billion). The largest fine ever imposed under the GDPR remains the EUR1.2 billion (USD1.26 billion/GBP996 million) penalty issued by the Irish DPC against Meta Platforms Ireland Limited in 2023.

Trends and Insights

In the year from 28 January 2024, EUR1.2 billion in fines was imposed, a 33% decrease compared to the aggregate fines imposed in the previous year and a break in the seven-year trend of increasing enforcement. This does not represent a shift in focus away from personal data enforcement; the clear year-on-year trend remains upwards. This year’s reduction is almost entirely due to the record-breaking EUR1.2 billion fine against Meta falling in 2023, which skewed that year’s figures. There was no record-breaking fine in 2024.

Big tech companies and social media giants continue to be the primary targets for record fines, with nearly all of the top 10 largest fines since 2018 imposed on this sector. This year alone the Irish Data Protection Commission issued fines of EUR310 million (USD326 million/GBP257 million) against LinkedIn and EUR251 million (USD264 million/GBP208 million) against Meta. In August 2024, the Dutch Data Protection Authority issued a fine of EUR290 million (USD305 million/GBP241 million) against a well-known ride-hailing app in relation to transfers of personal data to a third country.

2024 enforcement expanded notably in other sectors, including financial services and energy. For example, the Spanish Data Protection Authority issued two fines totalling EUR6.2 million (USD6.5 million/GBP5.1 million) against a large bank for inadequate security measures, and the Italian Data Protection Authority fined a utility provider EUR5 million (USD5.25 million/GBP4.15 million) for using outdated customer data.

The UK was an outlier in 2024, issuing very few fines. The UK Information Commissioner John Edwards was quoted in the British press in November 2024 as saying that he does not agree that fines are likely to have the greatest impact and that they would tie his office up in years of litigation. It is an approach unlikely to catch on in the rest of Europe.

The dawn of personal liability

Perhaps most significantly, a focus on governance and oversight has led to a number of enforcement decisions citing failings in these areas and specifically calling out failings of management bodies. Most notably, the Dutch Data Protection Authority announced it is investigating whether it can hold the directors of Clearview AI personally liable for numerous breaches of the GDPR, following a EUR30.5 million (USD32.03 million/GBP25.32 million) fine against the company. This novel investigation into the possibility of holding Clearview AI’s management personally liable for the continued failings of the company signals a potentially significant shift in focus by regulators, who recognise the power of personal liability to focus minds and drive better compliance.

Data Breach Notifications

The average number of breach notifications per day increased slightly to 363 from 335 last year. This ‘levelling off’ is consistent with previous years and is likely indicative of organisations becoming more wary of reporting data breaches, given the risk of investigations, enforcement, fines and compensation claims that may follow notification.

A recurring theme of DLA Piper’s previous annual surveys is that there has been little change at the top of the tables regarding the total number of data breach notifications made since the GDPR came into force on 25 May 2018 and during the most recent full year from 28 January 2024 to 27 January 2025. The Netherlands, Germany, and Poland remain the top three countries for the highest number of data breaches notified, with 33,471, 27,829 and 14,286 breaches notified respectively.

AI enforcement

There have been a number of decisions this year signalling the intent of data protection supervisory authorities to closely scrutinise the operation of AI technologies and their alignment with privacy and data protection laws. For businesses, this highlights the need to integrate GDPR compliance into the core design and functionality of their AI systems.

Commenting on the survey findings, Ross McKean, Chair of the UK Data, Privacy and Cybersecurity practice, said:

“European regulators have signalled a more assertive approach to enforcement during 2024 to ensure that AI training, deployment and use remains within the guard rails of the GDPR.”

We expect this trend to continue during 2025 as US AI technology comes up against European data protection laws.

John Magee, Global Co-Chair of DLA Piper’s Data, Privacy and Cybersecurity practice, commented:

“The headline figures in this year’s survey have, for the first time ever, not broken any records so you may be forgiven for assuming a cooling of interest and enforcement by Europe’s data regulators. This couldn’t be further from the truth. From growing enforcement in sectors away from big tech and social media, to the use of the GDPR as an incumbent guardrail for AI enforcement as AI specific regulation falls into place, to significant fines across the likes of Germany, Italy and the Netherlands, and the UK’s shift away from fine-first enforcement – GDPR enforcement remains a dynamic and evolving arena.”

Ross McKean added:

“For me, I will mostly remember 2024 as the year that GDPR enforcement got personal.”

“As the Dutch DPA champions personal liability for the management of Clearview AI, 2025 may well be the year that regulators pivot more to naming and shaming and personal liability to drive data compliance.”

UK: Data protection authority issues reprimand to gambling operator for unlawfully processing personal data

On 16 September 2024, the UK’s data protection authority, the Information Commissioner’s Office (ICO), issued a reprimand against Sky Betting and Gaming (SkyBet) for unlawfully processing people’s data through advertising cookies without their consent.

Between 10 January and 3 March 2023, SkyBet’s website dropped third-party AdTech cookies onto visitors’ browsers before visitors could accept or reject them via a cookie banner. As a result, visitors’ personal data (e.g., device information and unique identifiers) was shared automatically with third-party AdTech companies without visitors’ consent or a lawful basis. The cookies were deployed to allow advertising to be placed on other websites viewed by the visitor.
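
To illustrate the compliance point (this is a minimal sketch, not SkyBet’s actual implementation), consent management setups typically hold back third-party advertising tags until the visitor has made a choice. The tag URL below is a placeholder:

```typescript
// Illustrative only: defer AdTech tags until the visitor consents.
type ConsentState = "accepted" | "rejected" | "pending";

function loadAdTag(src: string): void {
  const script = document.createElement("script");
  script.src = src;
  script.async = true;
  document.head.appendChild(script);
}

function onConsentChange(state: ConsentState): void {
  if (state === "accepted") {
    // Only now may advertising cookies be set and identifiers shared.
    loadAdTag("https://ads.example.com/tag.js"); // placeholder vendor tag
  }
  // "rejected" or "pending": no ad tags load, so no advertising cookies drop.
}

onConsentChange("pending"); // on page load, nothing fires until the banner is answered
```

The failure described above is, in effect, the opposite pattern: the tags fired before any consent signal existed.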

Whilst the ICO found no evidence of deliberate misuse of personal data to target vulnerable gamblers, it reprimanded SkyBet because it processed personal data in a way that was not lawful, transparent or fair.

This reprimand forms part of the ICO’s wider strategy to ensure that individuals’ rights and freedoms are respected. The ICO has recently reviewed the UK’s 100 most-visited websites and contacted more than half to warn of enforcement action. Many are reported to have implemented improvements, such as displaying a “reject all” button or presenting “accept all” and “reject all” options on an equal footing.

The ICO intends to assess the next 100 most-frequented websites and urges all organisations to review their cookie banners to ensure that consent is freely given. The ICO also intends to publish guidance on cookies and tracking technology before the end of the year.

DLA Piper advises businesses on cookie compliance and is currently engaged by several businesses operating in the AdTech ecosystem on assessing risk exposure and responding to ICO engagement. Should you wish to discuss this further, please reach out to your regular DLA Piper contact or the authors of this blog.

Ireland: DPC Issues Record 87% of EU GDPR Fines in 2023; Breach Reports Increase by 20%

The Data Protection Commission (DPC) has published its 2023 Annual Report, highlighting a record year in which DPC fines accounted for 87% of all GDPR fines issued across the EU. A busy year for the DPC also saw a 20% increase in reported personal data breaches, and came as Helen Dixon stepped down after 10 years in the job, with Dr. Des Hogan and Dale Sunderland taking over the reins.

The past year has seen the DPC progress ongoing large-scale inquiries, in particular against social media platforms, defend cross-border decisions in legal proceedings brought by regulated entities appealing those decisions, and increase its interaction with the European Data Protection Board (EDPB). As a result, DPC fines account for 87% of the GDPR fines issued by EU data protection authorities last year.

The DPC received a total of 6,991 valid notifications of personal data breaches in 2023, an increase of 20% against the previous year. The DPC also handled 43 complaints relating to alleged personal data breaches which were not notified to the DPC in line with Article 33.

Unauthorised disclosure of personal data continues to be the leading reason for breach notifications, accounting for 52% of the overall total in 2023. 146 of the valid data breach notifications were received under the ePrivacy Regulations (an increase of 42%), and 59 notifications were received in relation to the Law Enforcement Directive. In line with previous years, most incidents reported originate from the private sector (3,766), followed by the public sector (2,968), with the remainder coming from the voluntary and charity sector (275).

Complaints Handling

The Annual Report notes another year of extensive enforcement work by the DPC. In total, 11,147 cases were concluded by the DPC in 2023. As of 31 December 2023, the DPC had 89 statutory inquiries on-hand, including 51 cross-border inquiries. In addition to its cases and inquiries, the DPC also handled over 25,130 electronic contacts, 7,085 phone calls and 1,253 postal contacts. 

The Annual Report highlights that once again the most frequent GDPR topics for queries and complaints in 2023 were access requests, fair processing, disclosure, direct marketing and the right to erasure (delisting and/or removal requests).

Administrative Fines and Large-Scale Inquiries

The Annual Report highlights 19 inquiries that concluded in 2023, resulting in fines totalling €1.55 billion. The tables below show a consistent enforcement strategy being implemented by the DPC, focusing on international and domestic companies and their compliance with core principles of the GDPR (e.g. transparency, lawful basis, security measures) as well as targeted thematic focuses (e.g. children’s personal data and video surveillance).

Since the implementation of the GDPR, the DPC has been established as the Lead Supervisory Authority for 87% of cross-border complaints.

Notable large-scale cross-border inquiries that concluded in 2023 were:

Controller Sector | Fine | Issues At Play
Social Media | €5.5 million | Controller was not entitled to rely on contract as a lawful basis for service improvement and security under its terms and conditions.
Social Media | €1.2 billion | Transfer of data from the EU to the US without a lawful basis.
Social Media | €345 million | Processing of children’s personal data.

Notable domestic inquiries that concluded in 2023 were:

Controller Sector | Fine | Issues At Play
Financial Services | €750,000 | Ten data breaches relating to the unauthorised disclosure of personal data on a customer-facing app.
Healthcare | €460,000 | A ransomware attack which impacted over 70,000 patients and their data, with 2,500 permanently affected when data was deleted with no back-up.
County Council | €50,000 | Usage of CCTV, car plate reading technology and body-worn cameras.

Ongoing Inquiries

The breadth and scale of the inquiries being undertaken by the DPC show no signs of abating. Notable inquiries that have been progressed by the DPC include:

Controller Sector | Status | Issues At Play
Government Department | DPC is preparing a Statement of Issues | Allegation that the database used for the Public Services Card was unlawfully provided to the Department.
Technology | Draft Decision with peer regulators for review (Art 60 GDPR) | Processing of location data.
Technology | Draft Decision with peer regulators for review (Art 60 GDPR) | Compliance with transparency obligations when responding to data subjects.
Social Media | DPC has issued preliminary draft decisions in relation to four related inquiries | User-generated data being posted on social media.
Social Media | Draft Decision with peer regulators for review (Art 60 GDPR) | Transfer of data from the EU to China.
Technology | Draft Decision with peer regulators for review (Art 60 GDPR) | Real-time bidding / adtech and data subject access rights.
Social Media | DPC is preparing its preliminary draft decision | Allegation of collated datasets being made available online.

Litigation  

At the outset of its Annual Report, the DPC recognizes the continued focus on domestic litigation before the Irish Courts. The DPC was awarded a considerable number of legal costs orders in 2023. The threat of a legal cost order may act as a deterrent to those considering challenging the DPC in the future.

There were 7 national judgments or final orders in 2023 split almost evenly between the Irish Circuit Court and the Irish High Court. The cases involved: 1 plenary matter, 5 appeals (with 4 statutory appeals and 1 appeal on a point of law) and 1 judicial review. 2 cases issued against the DPC were discontinued and a further 5 were concluded. The legal costs of 5 proceedings were awarded in favour of the DPC, with no reference to costs made in the reports for the other 2 proceedings. These awards enable the DPC to seek the legal costs it incurred in defending the proceedings against the claimant(s).

The DPC uses the Annual Report to showcase its supervisory and enforcement functions in relation to the processing of personal data in the context of electronic communications under the e-Privacy Regulations. The Annual Report highlights 4 successful prosecutions involving unsolicited marketing messages. In all 4 cases, the DPC had the legal costs of the prosecution discharged by the defendants, two of which were companies in the telecommunications and insurance sectors.

Children  

Prioritising the protection of children and other vulnerable groups forms one of the five core pillars of the DPC’s Regulatory Strategy 2022 – 2027, so it was no surprise that the DPC continued to be proactive in safeguarding children’s data protection rights this year. This is reflected in the list of matters that were prioritised for direct intervention by the DPC during 2023, which included CCTV in school toilets and the posting of images of children online. The DPC issued a Final Decision and imposed a fine of €345 million against a major social media company for infringements of the GDPR related to the processing of children’s personal data.

The DPC also produced guidance for organisations and civil society to enhance the protection of children’s personal data. An example of this is the data protection toolkit for schools, which was devised by the DPC after it noticed in the course of supervisory and engagement activities that the sector was finding certain aspects of data protection compliance challenging.

Interestingly, the DPC has been nominated to represent the EDPB on the newly formed Task Force on Age Verification under the Digital Services Act and act as co-rapporteur in the preparation at EDPB level of guidance on children’s data protection issues. This leadership role follows the DPC’s publication of a guidance note on the Fundamentals of children’s data protection and the DPC’s enforcement activity in this area over recent years.

Data Protection Officers  

The DPC has continued its efforts to bring together the DPO community in Ireland, recognising the importance of the DPO’s role in data protection compliance for organisations. As at the end of 2023, the DPC had been notified of 3,520 DPOs. The DPC is actively engaging with DPO networks across a number of key sectors and has contributed to several events aimed at DPOs including a new course run by the Institute of Public Administration, ‘GDPR and Data Protection Programme for DPOs in the Public Service’.

Importantly, the DPC participated in the 2023 Coordinated Enforcement Framework (CEF) Topic ‘The Designation and Position of Data Protection Officers’. The DPC contacted 100 DPOs and identified three substantive issues in its national report:

  • Resources available to DPOs – a third of respondents noted they do not have sufficient resources to fulfill their role;
  • Conflicts of interest – over a third indicated their role is split with other core governance roles within their organisations; and
  • Tasks of the DPO – it was noted that many tasks assigned to DPOs do not actually complement the role of the DPO within many organisations.

Supervision  

A sectoral breakdown notes that of the 751 supervision engagements during 2023, 391 were with multinational technology companies. The DPC also provided guidance and observations on 37 proposed legislative measures.

Supervisory engagements undertaken by the DPC in 2023 included identifying data protection issues arising in the context of adult safeguarding and service provision to at-risk adults, and an examination of the use of technology in sport and the processing of health data for performance monitoring (a questionnaire is due to issue to voluntary and professional sports bodies).

The DPC also engaged with the Local Government Management Authority in relation to three draft codes of practice on the use of CCTV and mobile recording devices to investigate and prosecute certain waste and litter pollution related offences. Separately, given the significant increase in the use of CCTV in areas with an increased expectation of privacy, the DPC published a detailed update of its CCTV Guidance in November 2023.

In February 2024, Helen Dixon stepped down from her role as Data Protection Commissioner and Dr. Des Hogan, who serves as Chairperson, and Mr. Dale Sunderland commenced their new roles.

The DPC continues to focus on systemic non-compliance and children’s data protection rights in 2024 as well as participating in the EDPB’s ongoing coordinated enforcement action on the right of access. With the level of enforcement action taking place as well as the rapid pace of AI and technology development, organisations are advised to review and update their privacy frameworks to ensure compliance with the GDPR. 

UK: How much will I get fined if I don’t comply?

Since 2018, when the General Data Protection Regulation introduced the threat of significantly larger penalties than the legislation that went before, companies have asked us time and time again: “what is my financial risk for data protection non-compliance in the UK?”

The publication of the Information Commissioner’s Office’s new fining guidance offers some clarity on this question, including a published methodology the ICO will use to calculate any fine it imposes.

We are pleased to have contributed to the shaping of aspects of the new guidance following our consultation submission, which has been published here.

About the guidance

In accordance with its statutory duty, the Information Commissioner’s Office (“ICO“) has published new data protection fining guidance (the “Guidance“) with the intention of mapping out how regulatory enforcement fines are to be calculated going forward. Whilst the headline fines under the GDPR are well understood by the market, the methodology previously deployed by the ICO was less clear and it is only following the passage of time that trend analysis could be undertaken on the ICO’s enforcement actions.

The new guidance provides welcome detail to help organisations place more confidence in their actions and the potential consequences of data protection risk decisions that may be taken.

When would a fine be considered?

The ICO has confirmed that when deciding whether to issue a penalty notice, it will review the facts of each case and consider:

  1. the seriousness of the infringement or infringements;
  2. any relevant aggravating or mitigating factors; and
  3. whether imposing a fine would be:
    1. effective;
    2. proportionate; and
    3. dissuasive.

Further details on each circumstance are set out below.

A. Seriousness: this is determined by a consideration of a number of factors broken down as follows.

  1. The nature (i.e. whether the standard or higher maximum fine is applicable), gravity (i.e. the nature, scope and purpose of processing, the number of data subjects involved, and the level of damage suffered) and duration of the infringement.
  2. Whether it was intentional or negligent.

    • Intentional: senior management authorised the unlawful processing; or the processing was undertaken despite advice about the risks involved, or with a disregard of its internal policies.
    • Negligent: whether a controller/processor has breached the duty of care required by law. Any assessment of a breach would include assessing evidence of the following factors:

      • failing to create data protection policies;
      • failing to read and abide by its existing data protection policies (or, where relevant, an applicable code of conduct or certification);
      • human error, particularly where the person (or people) involved had not received adequate training on data protection risks;
      • failing to check for personal data in information that is published or otherwise disclosed; or
      • failing to apply technical updates in a timely manner.
  3. The categories of personal data affected. The ICO will consider infringements relevant to the processing of special category data, criminal convictions and offences data, and personal data falling within the definitions of ‘sensitive processing’, as particularly serious. The ICO also considers data categories likely to cause damage or distress to data subjects as particularly serious, such as: location data, private communications (intimate or confidential information), passport or driving licence details, or financial data.

The ICO acknowledges that assessing these various factors involves a degree of repetition, which it believes reflects the way the legislation is drafted and the fact that it needs to consider all relevant factors when: (i) deciding whether to impose a fine; and (ii) determining the amount of the fine.

B. Relevant aggravating or mitigating factors: Once the ICO has assessed the seriousness of the infringement, it will then consider whether there are any aggravating or mitigating factors.

  • Mitigation – the ICO will be looking for evidence of the controller/processor having tried to effectively mitigate the harmful consequences of the infringement on the data subjects involved and the level of impact which that action had on the data subjects. The ICO will also give due consideration to measures in place prior to any investigation or the ICO otherwise becoming aware of the infringement.
  • Degree of responsibility – the ICO will consider the extent of what the controller/processor did considering its size and resources; and the nature and purpose of the processing. The ICO will be assessing any shared responsibility between controllers or between controllers and processors.
  • Previous infringement or measures previously ordered – the ICO will give greater weight to infringements which have been of a similar nature or infringements which occurred recently. The ICO will also have regard for compliance measures it has previously ordered concerning the same subject-matter.
  • Cooperation with the ICO – controllers/processors are expected to cooperate with the ICO and should respond to requests for information where possible; performing this minimum is therefore unlikely to be seen as a mitigating factor by the ICO. However, cooperating in a way that enables the enforcement process to be concluded more effectively, or that significantly limits the harmful consequences for people’s rights and freedoms, will be viewed favourably.
  • How the ICO became aware – the extent to which the controller or processor notified the ICO about the infringement. Notification may be regarded as a mitigating factor if it was made of the organisation’s own volition and the ICO was previously unaware; this does not apply to statutory obligations to notify (e.g. Art. 33 UK GDPR). If the ICO finds out about an infringement from a complaint, the media or its own intelligence, this will usually be treated as a neutral point.
  • Codes of conduct or certification mechanisms – Adhering to approved codes of conduct or approved certification mechanisms will be given due regard. However, failure to meet the standards signed up to may be considered an aggravating factor.
  • Other aggravating or mitigating factors – any economic or financial benefit obtained, or losses avoided, as a result of the infringement, as well as any action the controller/processor took proactively to report a breach to other appropriate bodies, such as the National Cyber Security Centre, and whether any subsequent advice issued was followed.

We argued, unsuccessfully, in our submission to the ICO that the Guidance reads as though there is an imbalance between aggravating and mitigating factors. For example, a demonstrable history of compliance (an unblemished record supported by evidence) was not accepted as a mitigating factor, despite previous infringements being considered an aggravating factor. Nevertheless, proactive technical and organisational measures in place would factor into the other mitigating measures set out above.

C. Effectiveness, proportionality and dissuasiveness:

  1. To be effective, the fine should help ensure compliance with data protection legislation and/or provide appropriate sanctions for infringement;
  2. Proportionate means the fine does not exceed what is appropriate and necessary. It shows that the Commissioner has considered all the relevant circumstances, including:

    • the seriousness of the infringement,
    • the harm or other impact on data subjects, and
    • the size and financial position of the controller/processor; and
  3. To be dissuasive, the fine should be a genuine deterrent to future non-compliance (both specific to the infringing controller/processor and generally as a message to the market).

For reasons of certainty, it is potentially unhelpful that the ICO has expressed its desire to maintain a significant degree of discretion at this stage of the fine setting process (both with respect to whether to impose a fine and the calculation of the level of the fine). Whilst the ICO states it will seek to ensure there is broad consistency, it remains to be seen how well this works in practice. Further, we have highlighted that the Guidance sets out that proportionality is a secondary analysis and only considered after it has been confirmed that the penalty would be effective and dissuasive. We submitted to the ICO that this represented a two-step process that went beyond the UK GDPR and could lead to unintended consequences.

Calculating the fine

If the decision is taken to issue a penalty notice, then the fine amount will be calculated by following five steps:

  1. Assessment of the seriousness of the infringement – looking at A. above, the ICO will determine a starting point for all fines based upon the seriousness of the infringement. The starting point will vary between:

    • serious infringements: the fine will be between 20% and 100% of the legal maximum;
    • infringements with a medium degree of seriousness: the fine will be between 10% and 20% of the legal maximum; and
    • infringements with a lower degree of seriousness: the fine will be between 0% and 10% of the legal maximum.
  2. Accounting for turnover – the ICO will then review the undertaking’s total worldwide annual turnover in its previous financial year (or where the controller / processor is not an undertaking, the ICO will review the assets, funding or administrative budget of the entity) and adjust the fine amount indicated by the calculation of the seriousness of the infringement. An undertaking that has an annual turnover of over £435 million is potentially exposed to fines of up to 4% of annual global turnover (so the statutory maximum). The adjustment applied will mean that undertakings with a relatively low turnover are exposed to a mere fraction of the statutory maximum: for example, an undertaking with turnover of up to £2 million should receive a penalty of up to 0.4% of the sum indicated by the “seriousness of infringement” figure derived from Step 1. The adjustment downward can be significant.
  3. Calculation of the starting point – the ICO will then calculate the starting point in one of two ways (depending on the outcomes of step 1 and step 2):

    • If the statutory maximum is a fixed amount, then: [statutory maximum amount (fixed)] x [adjustment for seriousness] x [turnover adjustment]; or
    • If statutory maximum is turnover based, then: [turnover] x [statutory maximum amount (percentage)] x [adjustment for seriousness].
  4. Adjustment to take into account any aggravating or mitigating factors – looking at Section B. above, the ICO will consider whether aggravating / mitigating factors should warrant an increase or decrease in the level of the fine.
  5. Assessment of whether the fine is effective, proportionate and dissuasive – looking at Section C. above, the ICO will also seek to ensure the fine does not exceed the statutory maximum amount (a worked illustration of these five steps follows below).
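
For illustration, the five steps can be read as a single calculation. The sketch below walks through an invented case using the bands published in the Guidance; the turnover figure, the percentage chosen within the seriousness band and the 10% uplift are our own assumptions, not ICO figures.

```typescript
// Hypothetical walk-through of the ICO's five-step fine calculation.
// All inputs are invented for illustration.
const turnover = 500_000_000;   // Step 2: worldwide annual turnover (GBP);
                                // above £435m, so no downward turnover adjustment
const statutoryMaxRate = 0.04;  // turnover-based statutory maximum: 4%
const seriousness = 0.2;        // Step 1: 20%, the bottom of the "serious" band (20%-100%)

// Step 3: starting point where the statutory maximum is turnover based:
// [turnover] x [statutory maximum amount (percentage)] x [adjustment for seriousness]
const startingPoint = turnover * statutoryMaxRate * seriousness; // GBP 4,000,000

// Step 4: adjust for aggravating / mitigating factors (assumed 10% uplift).
const adjusted = startingPoint * 1.1; // GBP 4,400,000

// Step 5: sense-check effectiveness, proportionality and dissuasiveness,
// and ensure the fine does not exceed the statutory maximum.
const statutoryMax = turnover * statutoryMaxRate; // GBP 20,000,000
const fine = Math.min(adjusted, statutoryMax);
console.log(`Indicative fine: GBP ${fine.toLocaleString()}`); // GBP 4,400,000
```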

In our submission, we proposed that the ICO consider taking account of the Competition and Markets Authority’s (CMA) method of calculating fines, which includes a specific step dedicated to settlement discounts. We suggested that the Commissioner adopt a similar stance to the CMA, permitting organisations to engage in formal settlement discussions and allowing a discount for any settlement where the infringing party admits its participation in the infringement. It is worth noting that the ICO welcomed the suggestion of introducing a formal settlement policy and offering a reduction in fines on that basis, and although this was ultimately outside the scope of the Guidance, the ICO will look to mirror this approach in the future.

We were also troubled by how the draft guidance approached the concept of an undertaking. We note that, in response to our submission, the Guidance is now much more detailed as to the approach the ICO will take to determining whether a parent company has decisive influence over a subsidiary and therefore whether the turnover of the parent company itself should be taken into account.

We also note the amendment made by the ICO following consideration of the submissions to reflect that steps taken to mitigate damage to data subjects following a personal data breach are a mitigating factor when it comes to calculating the penalty.

Concluding remarks

Between 2019 and 2024, the fines issued by the ICO varied significantly between the value contained in the notice of intent and the final amount ultimately levied against the organisation. The Guidance now provides a clearer reference point which companies can refer to and overlay into their risk documentation – particularly where financial risk is assessed. This will help build out risk analyses and add further clarity on the level of fine that any data protection infringement discovered by an organisation could attract. Though, as referenced above, there is still a residual challenge given the inherent discretion that remains.

Should you wish to discuss any matter contained within this article, please reach out to the authors or your regular data protection point of contact.

CJEU ruling clarifies data protection and e-privacy issues in the ad-tech space

Introduction

Identifiability, what can amount to personal data, and joint controllership are some of the issues addressed by the Court of Justice of the European Union (CJEU) in its recent judgment in the IAB Europe case (C-604/22). The case concerned the use of personal data for online advertising purposes and the use of real-time bidding technology.

The CJEU’s judgment, delivered on 7 March 2024, is the result of IAB Europe’s appeal of a decision of the Belgian Data Protection Authority (Belgian DPA) regarding the Transparency and Consent Framework (TCF) and IAB Europe’s role within it.

Background

IAB Europe is a non-profit association representing undertakings in the digital marketing and advertising sector at European level. It developed the TCF, which is an operational framework of rules intended to enable online publishers, data brokers and advertisers to obtain users’ consent and lawfully process their personal data.

The TCF is widely applied in the context of a real time auctioning system used to acquire advertising space for the display of targeted advertisements online. A key component of the TCF is the Transparency and Consent String (TC String).

The TC String is a combination of letters and characters which encodes and records user preferences, captured through consent management platforms (CMPs) when users visit a website or app. The TC String is then shared with ad platforms and other participants in the ad-tech ecosystem; the CMP also places a specific cookie on the user’s device. When combined, the TC String and this cookie can be linked to the user’s IP address.
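
As a simplified illustration (the real TC String is a compact, base64url-encoded bit field defined by the TCF specification, not the object sketched here), the data a downstream ad-tech participant may end up holding can be modelled roughly as follows:

```typescript
// Simplified model for illustration; not the actual TCF encoding.
interface TcStringPayload {
  created: Date;                // when the preferences were captured
  purposeConsents: Set<number>; // purposes (e.g. personalised ads) consented to
  vendorConsents: Set<number>;  // vendor IDs the user consented to
}

// What a participant may hold for a given ad request:
interface AdRequestContext {
  tcString: string;    // the encoded preferences, e.g. "CP1a2b..." (placeholder)
  cmpCookieId: string; // identifier from the cookie placed by the CMP
  ipAddress: string;   // observed when the request is made
}

// The identifiability question at the heart of the case: once the tcString
// can be combined with cmpCookieId and ipAddress, the recorded preferences
// relate to an identifiable user.
```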

On 2 February 2022, the Belgian DPA held that the TC String amounts to personal data, that IAB Europe qualifies as a data controller under the GDPR, and that IAB Europe is as a result in non-compliance with certain requirements of the GDPR (for details see our blogpost at Belgian DPA decision on IAB Transparency and Consent Framework | Privacy Matters (dlapiper.com)).

IAB Europe contested the Belgian DPA decision, and the Brussels Court of Appeal referred two questions to the CJEU for a preliminary ruling:

  1. Whether a character string capturing user preferences in connection to the processing of their personal data constitutes personal data.
  2. Whether an organisation which proposes to its members a framework relating to the consent to the processing of personal data containing rules setting out how such personal data is to be stored or disseminated must be classified as a controller within the meaning of the GDPR.

The ruling

First question

Drawing from its previous rulings, the CJEU stated that the concept of personal data under Article 4(1) of the GDPR includes information resulting from the processing of personal data relating to an identified or identifiable person. It noted that a string such as the TC String contains the individual preferences of a specific user in relation to the processing of their personal data.

The CJEU concluded that, if the combination of a TC String with additional data, such as the user’s IP address, allows the user to be identified, then the TC String contains information concerning an identifiable user and constitutes personal data within the meaning of Article 4(1) of the GDPR.

The fact that IAB Europe cannot itself combine the TC String with the user’s IP address and does not have direct access to the data processed by its members does not change that conclusion.

The CJEU took the view that, subject to the verifications that are for the Brussels Court of Appeal to carry out, IAB Europe under the TCF has reasonable means of identifying an individual from a TC String, by requesting that its members provide it with all information allowing it to identify the users whose data are the subject of a TC String.

It follows from this that a TC String can constitute personal data within the meaning of Article 4(1) of the GDPR.

Second question

To address the second question, the CJEU built upon its previous judgments and stated that a natural or legal person exerting influence over the processing of personal data and, as a result, participating in the determination of the purposes and means of that processing may be regarded as a controller within the meaning of Article 4(7) of the GDPR.

The CJEU confirmed again that the concept of joint controllership does not necessarily imply equal responsibility and does not require each joint controller to have access to the personal data concerned.

The CJEU took the view that IAB Europe, as a sectoral organisation which makes a standard available to its members, appears to exert influence over the personal data processing operations when consent preferences are recorded in a TC String, and jointly determines, with its members, the purposes and means of those operations.

It follows that IAB Europe can, in certain instances, be regarded as a controller within the meaning of Article 4(7) of the GDPR.

The court clarified this point further, adding that a distinction must be drawn between the processing of personal data carried out by the members of IAB Europe when the consent preferences of the users concerned are recorded in a TC String in accordance with the framework of rules established in the TCF, and the subsequent processing of personal data by operators and third parties on the basis of those preferences. Accordingly, the court was of the view that IAB Europe cannot automatically be regarded as a controller in respect of subsequent data processing operations carried out by third parties based on the preferences contained in the TC String, such as digital advertising or content personalisation, if IAB Europe does not exert an influence over the determination of either the purposes or the means of that processing.

Conclusion / implications

While not necessarily seismic or revelatory, the CJEU decision does bring welcome clarity on some longstanding data protection and e-privacy issues in the ad-tech space, in particular on the question of identifiability of individuals, the breadth of what can amount to personal data and the reach of joint controllership.

IAB Europe has welcomed the decision that “provides well-needed clarity over the concepts of personal data and (joint) controllership, which will allow a serene completion of the remaining legal proceedings“.

The next step is for the Brussels Court of Appeal to assess the matter and issue a final determination. Until then, the Belgian DPA’s decision remains suspended.

Despite all the prophecies of doom, we believe that the TCF will emerge stronger from this decision. This is because neither the questions submitted to the court nor the CJEU’s answers call the TCF into question. On the contrary, IAB Europe should be able to resolve the issue of joint controllership for the participants in the TCF at a technical level, especially since, according to the CJEU, joint controllership cannot automatically be assumed for subsequent processing operations on the basis of the preferences articulated via the TC String. Organisations should assess whether and how they are using the TCF and continue to keep developments in this judgment under review.

UK: Enforcement Against the Use of Biometrics in the Workplace

The ICO has issued an enforcement notice which provides valuable insights into its approach to the use of biometrics in the workplace, and the lawfulness of employee monitoring activities more broadly.

On 23 February 2024, the Information Commissioner’s Office (“ICO”) ordered Serco Leisure Operating Limited (“Serco”), an operator of leisure facilities, to stop using facial recognition technology and fingerprint scanning (“biometric data”) to monitor employee attendance and subsequent payment for their time. Serco operates the leisure facilities on behalf of leisure trusts, some of which were also issued enforcement notices, as joint controllers.

Background

Serco introduced biometric technology in May 2017 within 38 Serco-operated leisure facilities. Serco considered that previous systems for monitoring attendance were open to abuse: manual sign-in sheets were prone to human error and were abused by a minority of employees, and ID cards were used inappropriately by employees. As a result, Serco considered that using biometric technology was the best way to prevent these abuses.

To support this assessment, Serco produced a data protection impact assessment (“DPIA”) and legitimate interest assessment (“LIA”). Within these documents, Serco identified the lawful bases for the processing of biometric data as Articles 6(1)(b) and (f) and the relevant condition for special category personal data as Article 9(2)(b) of the UK General Data Protection Regulation (“UK GDPR”).

Article 6(1)(b) was selected on the basis that Serco considered that operating the attendance monitoring system was necessary for compliance with the employees’ employment contracts. Article 6(1)(f) was selected in connection with Serco’s legitimate interests, which presumably related to the wider aims of the attendance monitoring system and the move to use biometric data, outlined above.

Serco selected Article 9(2)(b) on the basis that it considered that this processing was required for compliance with applicable laws relating to employment, social security and social protection. In particular, Serco considered that it needed to process attendance data to comply with a number of regulations, such as working time regulations, national living wage, right to work and tax/accounting regulations.

The contravention

Despite the above, the ICO believed Serco, as a controller, had failed to establish an appropriate lawful basis and special category personal data processing condition for the processing of biometric data. Serco had therefore contravened Articles 5(1)(a), 6 and 9 of the UK GDPR. The ICO had previously served Serco with a Preliminary Enforcement Notice in November 2023, giving Serco the opportunity to provide written representations, which the ICO considered in issuing the Enforcement Notice of 23 February 2024.

The ICO gave Serco three months from the date of the Enforcement Notice to:

  • Cease all processing of biometric data for the purpose of employment attendance checks from the facilities, and not implement biometric technology at any further facilities; and
  • Destroy all biometric data and all other personal and special category data that Serco is not legally obliged to retain.

Key takeaways from the Enforcement Notice

  1. Processing must be necessary in order to rely on most lawful bases and special category personal data processing conditions.

The ICO emphasised that the processing of biometric data cannot be considered as “necessary” when less intrusive means could be used to achieve the same purpose.

It is not ordinarily necessary for an employer to process biometric data in order to operate an attendance monitoring system. It is of course necessary for employee attendance data to be processed, but this would not usually extend to biometric data.

It may be possible to argue that it is necessary to use biometric data in connection with attendance monitoring in an extreme case, but this would need to be based on specific circumstances. In this case, although Serco had considered that other less intrusive methods were subject to abuse, this consideration was not sufficient on its own to justify the use of biometric data.

The ICO’s position was that Serco had not provided enough information to support its argument that eliminating abuse of the attendance monitoring system was a necessity, rather than simply a further benefit to Serco. There was a lack of evidence of consideration of alternative means of handling such abuse e.g. taking disciplinary action against the individuals responsible. The processing of biometric data was therefore not a targeted and proportionate way of achieving the purpose of verifying attendance.

  2. An appropriate balancing test must be conducted when relying on legitimate interest.

The ICO considered that in relying on its legitimate interests as a lawful basis, Serco did not give appropriate weight to the intrusive nature of biometric processing and the risks to the employees. Failure to give such appropriate weight meant that Serco could not rely on Article 6(1)(f).

Additionally, the ICO found that legitimate interests would not be regarded as an appropriate lawful basis where:

    1. The processing has a substantial privacy impact. In this instance, it was the regular and systematic processing of employee biometric data, which would entail a regular intrusion into their privacy over which they have no, or minimal, control.
    2. Employees are not given clear information about how they could object, or about alternative methods of monitoring that do not involve intrusive processing. The fairness of processing, the availability and ease with which to exercise data subject rights, and the provision of clear information are factors that should be taken into account when relying on legitimate interests and conducting an appropriate balancing test. The ICO highlighted that Serco had failed to process data fairly by not bringing the alternative mechanisms to the employees’ attention, even when an employee complained. There was also a failure to process fairly because employees were not informed of how they could object to the processing.
    3. There is an imbalance of power between the employer and employees, such that employees may not have felt able to object (without detriment) even if they had been informed that they could.
  3. A specific legal obligation must be identified from the outset of processing in order to rely on Article 9(2)(b) UK GDPR.

In this instance, Serco had initially failed (including in its DPIA) to identify the specific obligation or right conferred by law on which it relied in reference to Article 9(2)(b) of the UK GDPR.

In this case, it may be that this omission was due to the fact that there is no such obligation or right conferred by law. Whilst there are legal obligations to record time and attendance data, health and safety obligations and requirements to manage the employment relationship, there are no specific legal obligations that would necessitate the processing of biometric data in connection with attendance monitoring.

In cases where there is a specific legal obligation or right conferred to process special category data (for example, in respect of the employer’s duty to make reasonable adjustments or to manage sickness at work), the ICO emphasised that it is not sufficient to simply select Article 9(2)(b) of the UK GDPR as the basis for processing. The controller must identify the specific obligation or right conferred by law and must have done so from the outset – before the processing of special category personal data commences.

It is also worth noting that, despite having conducted a DPIA and LIA, Serco could also not rely on this condition because Serco did not produce an appropriate policy document as required by Sch. 1 Para 1(1)(b) of the Data Protection Act 2018 (“DPA”) and had failed to demonstrate the necessity of processing biometric data (as referred to above).

4. The ICO will take account of infringement amplifiers.

In addition to biometric data being a category that carries a greater risk of harm, the length of time of processing without an appropriate lawful basis (since 2017) and the number of data subjects involved (2,283) were also factors that the ICO considered as increasing the seriousness of the infringement.

Summary and conclusion

This decision does allow for the possibility to argue that use of biometric data is necessary, targeted and proportionate for attendance monitoring. However, as mentioned above, this would very much depend on the circumstances and the decision shows that this is likely to be the exception rather than the rule.

If an employer sought to rely on its legal obligations as a lawful basis for the processing, the controller would need to be in a position to show that the processing was now necessary to comply with these requirements. This would require it to provide evidence of widespread abuse and failure of other less intrusive methods. However even in these circumstances the employer would still need to consider fairness and proportionality in the operation of the system, as explained in this post.

It is possible for an employer to consider using employee consent as a basis under Article 9(2)(a) for processing biometric data in an attendance management system, given the limitations of Article 9(2)(b). However, as noted above, the imbalance of power in the employment relationship will act against the employer in relying on this basis unless there is a genuine ability for the employee to refuse using the system. In such a case, the operation of an alternative option to biometric data will be critical.

If an employer did wish to adopt biometric data processing for attendance monitoring systems, following this decision, we recommend that such an employer includes the following steps in the context of undertaking its DPIA, LIA and implementation processes:

  • Identify the appropriate lawful basis for the processing activity.
  • If the lawful basis relates to a specific obligation or right conferred by law, identify and document that law.
  • Consider whether the processing could be said to be necessary for the identified lawful basis and gather supporting evidence for this assessment, where relevant.
  • Provide employees with clear information regarding the processing, including information regarding data retention and use, as well as clear information regarding their right to object. This must be provided in advance of the system being implemented.
  • Undertake a full consideration of the fairness and proportionality of the processing, acknowledging that processing biometric data is extremely intrusive and carries significant privacy impacts for employees.
  • Provide employees an alternative option to participate in the attendance monitoring system should they object to the use of their biometric data and ensure that this is used in practice (meaning that there must always be another way to monitor attendance alongside the biometric data).
  • Ensure that an appropriate policy document is implemented, if relying on a lawful basis under the UK GDPR that mandates this (e.g. Article 9(2)(b)).
California Attorney General Settles with DoorDash over Alleged Sale of Personal Information

Overview

On February 21, 2024, the California Attorney General (CA AG) announced that it had reached a settlement with DoorDash over allegations that the company failed to comply with “sale” requirements under the California Consumer Privacy Act (CCPA) and disclosure requirements under the California Online Privacy Protection Act (CalOPPA). The settlement requires DoorDash to pay a $375,000 civil penalty and comply with specific injunctive terms.

The CA AG’s complaint alleges that DoorDash participated in marketing co-operatives (“co-ops”) that involved the company providing its customers’ personal information (such as names, addresses, and transaction histories) to the co-op without providing its customers with notice or an opportunity to opt out of the sale. Upon receiving DoorDash’s customer personal information, the co-op would combine DoorDash’s customer data with the customer data of other third-party co-op members, analyze the data, and allow members to send mailed advertisements to potential leads. The CA AG considered such data disclosure a “sale” of personal information under the CCPA’s broad definition of that term. Specifically, DoorDash received “valuable consideration” in exchange for disclosing its customer data to the co-op, namely the “opportunity to advertise its services directly to the customers of the other participating companies.”

The CA AG’s second cause of action invoked CalOPPA, a 20-year-old California privacy law that imposes transparency obligations on companies that operate websites for commercial purposes and collect personally identifiable information from Californians. The complaint alleged violations of CalOPPA by DoorDash due to the company’s failure to disclose in its privacy policy that it would share its customers’ personally identifiable information with other third-party businesses (e.g., marketing co-op members) for those businesses to contact DoorDash customers with ads.

Key Takeaways

This settlement serves as a critical reminder of the importance of compliance with current and emerging state privacy laws, emphasizing the broad definition of “sale” under the CCPA and the strict requirements for transparency and consumer choice. Additionally, we expect the California Privacy Protection Agency, another California privacy regulator (vested with full administrative power, authority, and jurisdiction to implement and enforce the CCPA) to ramp up its own investigative and enforcement efforts this year. Thus, businesses should consider the following:

  • “Selling” is Broader than Cookies – companies should re-assess how their data disclosure activities may be considered “selling” under the CCPA. Many companies focus on the use of third-party ad and analytics cookies on their websites as the main trigger for “sale” compliance obligations under the law. This settlement makes clear that companies should broaden their review and assessment of their marketing department’s use of personal information to consider non-cookie related data disclosures.
  • Review and Update Privacy Policies – an outdated, unfair and deceptive, or misleading privacy policy serves as an online billboard announcing a company’s non-compliance with state privacy laws as well as state unfair competition laws (such as California’s Unfair Competition Law (UCL)). As this settlement demonstrates, this can be a magnet for consumer complaints and regulatory scrutiny (including at the federal level under Section 5 of the Federal Trade Commission Act). Companies should continually review and update their privacy policies whenever they materially change how they handle personal information; under the CCPA, privacy policies must be updated at least annually.
  • Opt-Out Mechanisms – companies should ensure that compliant opt-out mechanisms, including an interactive webform and a “Do Not Sell or Share My Personal Information” or “Your Privacy Choices” link, are in place. Opt-out mechanisms must also recognize and respond to universal opt-out preference signals, such as the Global Privacy Control (GPC) signal (see the illustrative sketch after this list).
  • Don’t Forget the Apps – the complaint noted that both the DoorDash website and mobile application (App) failed to inform consumers about the sale of their personal information and their right to opt-out. Companies that collect personal information via an App and engage in “backend” selling of personal information should ensure that the App includes sufficient CCPA disclosures and a mechanism for users to easily opt-out of the sale of their personal information (see here for the CA AG’s previous announcements of an investigative sweep focused on violations of CCPA in the App context).
  • Marketing Co-Ops – this enforcement action makes clear that California regulators consider a company’s participation in a marketing co-operative to be a “sale” under the CCPA. Companies participating in marketing co-ops and other third-party data sharing arrangements should carefully review their agreements with the data recipients to ensure they restrict the recipients’ ability to further disclose or sell consumer personal information.
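
For readers implementing the GPC point above, the following is a minimal sketch only (in TypeScript, using the Express framework): browsers with GPC enabled send the “Sec-GPC: 1” request header (and expose navigator.globalPrivacyControl to client-side scripts), so a server can flag such requests and suppress downstream “sales” or “shares” of personal information. The route, port, and the suppressSaleOrSharing flag are hypothetical illustrations, not part of any cited settlement or regulation.

```typescript
// Minimal sketch: honoring the Global Privacy Control (GPC) opt-out signal.
// Assumes Node.js with the "express" package installed; the flag name and
// route below are hypothetical examples.
import express from "express";

const app = express();

// Middleware: flag every request that carries the GPC signal.
app.use((req, res, next) => {
  // The GPC proposal defines the request header "Sec-GPC" with the value "1".
  if (req.get("Sec-GPC") === "1") {
    // Treat the signal as a valid request to opt out of the sale/sharing
    // of personal information for this request.
    res.locals.suppressSaleOrSharing = true;
  }
  next();
});

app.get("/", (req, res) => {
  // Downstream handlers consult the flag before loading third-party
  // trackers or disclosing data to marketing partners.
  if (res.locals.suppressSaleOrSharing) {
    res.send("GPC detected: sale/sharing of personal information suppressed.");
  } else {
    res.send("No GPC signal detected on this request.");
  }
});

app.listen(3000);
```

In practice, the same flag would also need to be honored wherever personal information actually leaves the business, for example before firing third-party ad tags or exporting customer lists to a marketing co-op.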

For more information about these developments and the CCPA in general, contact your DLA relationship Partner, the authors of this blog post, or any member of DLA’s Data, Privacy and Cybersecurity team.

NETHERLANDS: Highest court side-steps determining whether legitimate interests may be purely commercial
https://privacymatters.dlapiper.com/2022/07/netherlands-highest-court-side-steps-determining-whether-legitimate-interests-may-be-purely-commercial/ Thu, 28 Jul 2022

On 27 July 2022, the highest administrative court in the Netherlands published its highly anticipated judgment involving the Dutch Data Protection Authority’s assessment of “legitimate interest” under Article 6(1)(f) GDPR.

It was expected that the court would provide some clarification on whether “purely commercial interests” can qualify as legitimate interests within the meaning of Article 6(1)(f) GDPR, potentially by referring preliminary questions to the CJEU. Unfortunately, privacy professionals across Europe have been left empty-handed. The court found that the controller could rely on other legitimate interests that were not exclusively commercial. Hence, there was no need to consider the question of whether “purely commercial interests” could be a legitimate interest.

Background to the case

The appeal concerned a fine of EUR 575,000 issued by the Dutch Data Protection Authority (the “Dutch Authority“) to VoetbalTV for unlawful processing. VoetbalTV is a video platform for amateur football. The company streams or records videos of matches on behalf of clubs and processes the personal data of (young) amateur footballers.

In its decision of July 2020, the Dutch Authority concluded that VoetbalTV’s processing of amateur footballers’ personal data did not comply with Article 6(1)(f) GDPR since a legitimate interest could not be purely commercial in nature. In the Dutch Authority’s view, a legitimate interest cannot be broadly interpreted and should follow from the law.

VoetbalTV appealed the Dutch Authority’s decision, and a lower Dutch court ruled in favor of VoetbalTV in November 2020. The lower Dutch court confirmed that, in its view, the Dutch Authority had applied the test for legitimate interest too narrowly. Yesterday’s case concerned the Dutch Authority’s appeal of this judgment.

The question of ‘purely commercial interests’

The judgment of the Dutch administrative court was of significant interest not just to Dutch domiciled companies but also controllers across Europe, because the first step of the (well-established) legitimate interests assessment[1] has not yet been considered from this angle.

The issue raised in this case was of such importance that the EU Commission recently published an open letter to the Dutch Authority setting out why, in its view, the Dutch Authority’s strict interpretation of Article 6(1)(f) was not in line with the GDPR, regulators’ guidelines and the case law of the CJEU.

Unfortunately, the Dutch administrative court was handed a ‘free pass’ in the form of additional legitimate interests raised by VoetbalTV which were not exclusively commercial in nature. Such interests were:

  1. the increase in the involvement and enjoyment of football fans;
  2. the ability to perform technical analyses;
  3. offering friends and family members the opportunity to watch matches from a distance; and
  4. contributing to a higher level of privacy protection by preventing the recording of matches via other channels.

Therefore, the Dutch administrative court concluded in favor of VoetbalTV. It established that, having regard to the other interests put forward by VoetbalTV, there was no question of a “purely commercial interest”. For this reason, the court held that it did not have to answer the question and that no preliminary questions would be referred to the CJEU.

What does this mean for those relying on commercial legitimate interests?

This much-awaited ruling comes as somewhat of a disappointment, as it was hoped that it would bring some clarity on whether a purely commercial interest can be a “legitimate interest” for the purposes of Article 6(1)(f) GDPR.

Early, unofficial comments attributed to the Dutch Authority imply that it still sees the processing activities of VoetbalTV as unlawful; however, at the time of writing there has been no official commentary regarding the future of the Dutch Authority’s interpretation of “legitimate interests”.

In our view, controllers should note the EU Commission’s open letter, in which the Commission highlighted that it was difficult to reconcile this strict interpretation with the intended effect of the EU legislators. Perhaps more significantly, the EU Commission reminded the Dutch Authority that just because a purely commercial interest is legitimate does not mean that the controller can immediately rely on it – the second and third limbs of the three-part legitimate interests test must still be satisfied.

Therefore, whilst controllers relying on legitimate interests as a lawful basis for processing should continue to be clear and transparent about those interests, extra care should be taken to document the Legitimate Interests Assessment where those interests could be considered ‘commercial’.

For any questions relating to this decision or assistance with assessing legitimate interests, please contact Richard van Schaik, Partner or Francesca Pole, Associate – Data Protection – IPT Department DLA Piper Netherlands.

[1] Set out in Fashion ID, Case C-40/17, ECLI:EU:C:2019:629.
