Privacy Matters – DLA Piper's Global Privacy and Data Protection Resource

Europe: EDPB coordinated enforcement action identifies areas of improvement to promote the role and recognition of DPOs
https://privacymatters.dlapiper.com/2024/01/europe-edpb-coordinated-enforcement-action-identifies-areas-of-improvement-to-promote-the-role-and-recognition-of-dpos/ (30 January 2024)

Background

March 2023 saw the launch of the European Data Protection Board's (EDPB's) second coordinated enforcement action (CEF 2023), which focused on the designation and position of Data Protection Officers (DPOs). Data Protection Authorities (DPAs) across the EEA have launched coordinated investigations into this topic. In particular, DPAs have been investigating whether DPOs have the position in their organisations required by Art. 37-39 GDPR and the resources needed to carry out their tasks.

On 17 January 2024, the EDPB adopted a report on the findings of supervisory authorities participating in the CEF 2023. In particular, the report analyses the challenges faced by DPOs and organisations that have designated a DPO, and how these may impact compliance with data protection laws. The report also includes recommendations that organisations, DPOs and supervisory authorities may take into account to address these challenges.

Challenges faced by DPOs

Although the EDPB’s report recognises positive findings for many DPOs, it concludes that a number of DPOs still face obstacles, including:

  • an absence of designation of a DPO, even where appointment is mandatory;
  • insufficient resources allocated to the DPO;
  • insufficient expert knowledge and training of the DPO;
  • DPOs not being fully or explicitly entrusted with the tasks required under data protection law;
  • conflicts of interest and a lack of independence of the DPO;
  • a lack of reporting by the DPO to the organisation's highest management level; and
  • a requirement for further guidance from supervisory authorities.

Recommendations to address these challenges

In order to address the challenges identified, the report lists recommendations for organisations, DPOs and DPAs. These include:

  • encouraging DPAs to raise awareness amongst organisations of their obligation to appoint a DPO, through the promotion of existing guidance and enforcement actions, and providing further guidance, additional training materials and training sessions that could help a DPO navigate complex issues; and
  • encouraging organisations to ensure DPOs have sufficient resources to properly exercise their function and are given sufficient opportunities, time and resource to refresh their knowledge and learn about the latest developments.

EDPB conclusions

Despite the challenges identified in the report, the EDPB concludes that the overall results of the survey are encouraging, with the majority of DPOs confirming that they receive regular training and have the necessary skills and knowledge to do their job. However, the report emphasises the need to strengthen the role and recognition of DPOs, in order to ensure compliance with data protection laws.

The report also recognises that the role of the DPO seems to be changing in practice, with DPOs being tasked with key roles under new EU legislation – introduced as part of the EU Data Strategy – such as the AI Act, the Digital Services Act, the Digital Markets Act and the Data Act. The EDPB concludes that organisations will need to consider how DPOs are tasked, utilised and supported, to ensure that these new roles avoid issues such as conflicts of interest or insufficient resources at the disposal of the DPOs.

The EDPB has confirmed that the CEF 2024 action will focus on the implementation of the right of access by data controllers.

EU: Significant new CJEU decision on automated decision-making
https://privacymatters.dlapiper.com/2023/12/eu-significant-new-cjeu-decision-on-automated-decision-making/ (13 December 2023)

Authors: James Clark and Verena Grentzenberg

The Court of Justice of the European Union (CJEU) has delivered an important judgment on the scope and interpretation of the ‘automated decision-making’ framework under the GDPR.  It is a decision that could have significant implications for service providers who use algorithms to produce automated scores, profiles or other assessments that are relied upon by customers in a decision-making process.

Background

On 7 December 2023, the CJEU handed down judgment in the Schufa case.

Schufa AG ("Schufa") is Germany's leading credit rating agency and holds information about almost 70 million individuals.  Amongst other things, it provides credit scores for German residents.  These scores are then relied upon by financial service providers to make lending decisions, such as offering mortgages or other loans.  Schufa's other customers include retailers (online and brick-and-mortar), telecommunication service providers, and utility and transportation companies.

The case referred to the CJEU revolved around a German resident whose application for a loan was turned down by a German bank.  The bank’s decision was made primarily in reliance on a poor credit score assigned to that individual by Schufa.

The individual challenged Schufa and in particular requested that Schufa disclose information about its automated decision-making processes under Article 15(1)(h) GDPR.

By way of reminder, Article 22 GDPR restricts the taking of a decision about a data subject based solely on automated processing, where that decision produces legal effects concerning him or her or similarly significantly affects him or her.  Such a decision may only be taken under one of a limited number of grounds, and data subjects have an absolute right to contest the decision and obtain human intervention in the decision.

Article 15(1)(h) GDPR, meanwhile, is the component of the ‘right of access’ that allows a data subject to obtain, from the responsible controller, information about automated decision-making, including its ‘logic’ and its consequences.

Schufa rejected the assertion that it was responsible for automated decision-making, asserting that its role was to produce an automated score but that the relevant decision (whether to grant the loan) was taken by the third-party bank. 

Key Findings

The court rejected Schufa's argument and held that the creation of the credit score was, itself, a relevant automated decision for the purposes of Article 22 GDPR.  This runs contrary to the received wisdom that only the ultimate decision-maker – in this case, the bank using the credit score to decide on the loan application – was engaging in automated decision-making.

The following factors were central to the court’s conclusion on this point:

  • The score produced by Schufa was considered to play a ‘determining role’ in the decision about whether to grant credit. 
  • The court adopted a broad interpretation of the term 'decision', finding that it could encompass 'a number of acts which may affect the data subject in many ways'.  Consequently, it did not matter that the ultimate decision about whether to grant credit was not taken by Schufa – there was a sufficiently close nexus between Schufa's decision about what score to award and the subsequent credit decision.
  • Applying a purposive approach, the court also took into account the fact that Schufa was in a much better position than its customer to satisfy the Article 15 GDPR request and to provide meaningful information about the automated decision-making process, including its logic.

Implications

Businesses using algorithms or other automated processes to produce risk scores or similar outputs (for example, identity verification, fraud detection) are likely to be understandably concerned by the potential implications of this judgment.  In general, such companies have developed business models that assume the customer will bear the regulatory risk and responsibility associated with any decision taken using the company’s outputs. 

However, it is important that such companies read this judgment carefully and consider the ways in which their business models may be distinguished from those considered in Schufa.  For example:

  • To what extent does the company's customer rely solely or predominantly on the provided output when making a decision?  If the output is only one of a number of factors taken into account by the customer, and in particular if the customer attaches only a moderate degree of weight / significance to this factor, then the circumstances may be sufficiently different. If not, it will be important that the company ensures that customers can rely on one of the exceptions to Article 22 GDPR, namely: explicit consent or necessity for a contract between the customer and the data subject. Member State law can also provide for an authorisation, where such authorisation lays down "suitable measures" to safeguard the data subject's rights and freedoms.
  • Is the ultimate decision one that has a legal or similarly significant effect?  For example, a company may be specialised in producing automated marketing profiles / segmentations that are then relied upon by a customer to determine the marketing content to be sent to a consumer.  However, other than in limited special circumstances, it is unlikely that the decision about what marketing content to send to a consumer will constitute a 'significant' decision for Article 22 GDPR purposes. Equally, in relation to Schufa, it is likely that many of Schufa's customers do not use the credit scores provided for decisions that have a significant effect on the data subject – for example, where the customer is an online shop and only uses the data to decide whether to request payment from a specific customer before or after delivery of their goods or services.

In a quirk of timing, we note that the Schufa judgment was handed down in the same week that the trilogue process around the EU AI Act concluded.  The use of AI systems to make decisions about the offering of credit is one of a number of ‘high risk’ use cases found in the Act.  Going forward, it looks likely that Schufa will become an important touchstone for businesses developing AI-enabled solutions that are relied upon by customers of those businesses in important decision-making processes.  
