Rachel De Souza | Privacy Matters | DLA Piper Data Protection and Privacy | DLA Piper

European Commission adopts new adequacy decision for EU-US data flows
https://privacymatters.dlapiper.com/2023/07/european-commission-adopts-new-adequacy-decision-for-eu-us-data-flows-2/
Tue, 11 Jul 2023

Authors: Jim Sullivan, Rachel De Souza, Heidi Waem, John Magee and David Brazil

On 10 July 2023, the European Commission adopted its long-awaited adequacy decision for the EU-US Data Privacy Framework (DPF). The DPF replaces the Privacy Shield Framework (Privacy Shield) which was invalidated by the Schrems II decision of the Court of Justice of the European Union (CJEU) in July 2020.  Effective immediately, the new adequacy decision allows personal data to flow from the European Economic Area (EEA) to DPF-certified US companies without the need for additional data protection safeguards.

Implications 

In a manner comparable to its predecessors, Privacy Shield and the EU-U.S. Safe Harbor Framework (Safe Harbor), the DPF enables certified companies that make legally binding commitments to comply with the DPF Principles (contained in Annex I to the adequacy decision) to receive personal data from the EEA without having to rely on EU-approved transfer mechanisms such as Standard Contractual Clauses (SCCs) or Binding Corporate Rules (BCRs), or to conduct Transfer Impact Assessments (TIAs). The European Commission has concluded that the United States (US) ensures an adequate level of protection, comparable to that of the European Union (EU), for personal data transferred from the EU to US companies under the new DPF.

For the nearly 3,000 companies that have maintained their Privacy Shield certifications since the Schrems II decision, the new adequacy decision should permit them to avail themselves of the updated DPF relatively quickly. Companies not currently certified would need to start the DPF certification process from scratch.

Any US organization that is subject to the jurisdiction of the Federal Trade Commission (FTC) or the Department of Transportation (DOT) may certify under the new DPF. While the FTC has broad authority over companies engaged in commerce, it does not have jurisdiction over, among others, non-profits, most depository institutions (banks, federal credit unions, and savings & loan institutions), and common carriers. In addition, the FTC’s jurisdiction with regard to insurance activities is limited to certain circumstances. Organisations that are ineligible or prefer not to rely on the DPF will still need to use SCCs, BCRs, or another transfer mechanism and carry out TIAs.

Legal Challenges

The European Commission has stated that the DPF introduces “significant improvements compared to the mechanism that existed under the Privacy Shield”. An essential element of the US legal framework on which the adequacy decision is based is Executive Order 14086 on ‘Enhancing Safeguards for United States Signals Intelligence Activities’ (EO). In particular, the EO directed US government agencies to take steps to implement several commitments, including:

  • Additional safeguards to limit US intelligence authorities’ access to personal data to what is necessary and proportionate to protect national security,
  • Enhanced oversight of US intelligence services’ activities to ensure compliance with limitations on surveillance activities, and
  • Redress mechanisms, including establishment of the Data Protection Review Court (DPRC) to which EEA individuals will have access. DPRC decisions regarding violations of applicable US law (and appropriate remediation) are legally binding and the DPRC will select a special advocate in each case to advocate on behalf of the complainant.

Notwithstanding the European Commission’s assertion that the binding safeguards implemented pursuant to the EO “address all the concerns raised by the European Court of Justice,” the DPF is expected to be contested. Max Schrems’ privacy organisation, NOYB, which led the previous legal challenges to both Privacy Shield and Safe Harbor, has already announced that it will also challenge the DPF. Characterizing it as “largely a copy of the failed ‘Privacy Shield,’” NOYB claims that “there is little change in US law or the approach taken by the EU” and that “[t]he fundamental problem with FISA 702 was not addressed by the US, as the US still takes the view that only US persons are worthy of constitutional rights.”  Given the previous invalidations of Privacy Shield and Safe Harbor by the CJEU, the long-term durability of the DPF remains a concern.

Next steps

For DPF-eligible organisations, the adequacy decision will significantly ease their compliance burdens. To participate in the DPF, they must certify through the new DPF website maintained by the U.S. Department of Commerce.

For those that are ineligible to certify under the DPF (i.e. not subject to FTC or DOT jurisdiction), SCCs and BCRs will likely remain the default transfer mechanisms. As they are not covered by the DPF, such organizations will still need to conduct TIAs, although the changes to US surveillance laws under the EO should simplify the TIA process.

The functioning of the DPF will be subject to periodic reviews, to be carried out by the European Commission, together with representatives of European data protection authorities and competent US authorities.

The first review will take place within a year of the entry into force of the adequacy decision, in order to verify that all relevant elements have been fully implemented in the US legal framework and are functioning effectively in practice.

For further information, please contact your usual DLA Piper lawyer.

EU: International data transfer rules for non-personal data
https://privacymatters.dlapiper.com/2023/06/eu-international-data-transfer-rules-for-non-personal-data/
Tue, 27 Jun 2023

Global flows of personal data have been a source of geopolitical concern for many years now. The Court of Justice of the European Union’s “Schrems II” judgment has revived the debate, and organisations around the world now have to map personal data flows and conduct transfer impact assessments while patiently awaiting the developments around the upcoming EU-US Data Privacy Framework.

Until now, these compliance challenges have been essentially limited to personal data processing, so some organisations and sectors had to do more than others. However, several legislative initiatives deriving from the EU Data Strategy now contain transfer restrictions for non-personal data that will complement the EU’s already complex personal data transfer framework under the General Data Protection Regulation (GDPR) and will extend the issue of global data transfers to more organisations.

The Data Governance Act (DGA), the proposed Data Act and the proposed Regulation on the European Health Data Space (EHDS) provide the first comprehensive set of rules aimed at protecting EU corporations and the public sector against intellectual property theft and industrial espionage, and at safeguarding other public interest considerations. These new rules aim to ensure that “protected” non-personal data is not transferred to countries outside the European Economic Area (EEA) without sufficient protection of intellectual property rights, trade secrets, confidentiality, and other EU interests.

For further information on the upcoming data transfer restrictions for non-personal data that will apply to companies and public bodies that fall under the scope of these EU data initiatives, please see our article available here.

CHINA: China SCCs filing procedure now published – more preparation work must be done, and filings will be scrutinized
https://privacymatters.dlapiper.com/2023/06/china-china-sccs-filing-procedure-now-published-more-preparation-work-must-be-done-and-filings-will-be-scrutinized/
Thu, 08 Jun 2023

With the Cyberspace Administration of China’s (“CAC”) release last week of the Guidelines for Filing of Standard Contracts for Cross-border Transfers of Personal Information (“Guidelines”), organisations processing Mainland China personal data must now turn their attention to the China Standard Contractual Clauses (“China SCCs”) route for legitimizing their cross-border data transfers (“CBDTs”) of Mainland China personal data.

In short, the process is no longer a simple filing: in particular, the Personal Information Impact Assessment (“PIIA”) which must be filed alongside the China SCCs now seems closer to the full approval application required for the (more onerous) CAC Assessment route; the filed materials will be scrutinized and could be rejected; and, more broadly, non-compliance with the Personal Information Protection Law (“PIPL”) identified by the CAC via the China SCCs filing materials might lead to enforcement action. Therefore, it is critical for organisations subject to the China SCCs route to:

  • ensure their China data protection compliance programme has been properly designed and implemented;
  • identify and sign the China SCCs with relevant overseas recipients; and
  • file the China SCCs with primary overseas recipients – together with a fully-completed PIIA using the new template published by the CAC – by the relevant deadlines.

Who must follow the China SCCs route?

By way of reminder, personal information controllers who:

  • do not meet the thresholds for the CAC assessment/approval route (see our summary here); or
  • must follow the CAC certification route (primarily foreign personal information controllers of Mainland China personal information, see 10 below),

must follow the China SCCs route to legitimize access to, or transfers of, personal information outside of Mainland China.

China SCCs route – compliance step plan

1. Data mapping: understand CBDTs and approach to China SCCs

  • Undertake or update data mapping of CBDTs of China data, to have a clear picture of the flow of China CBDTs.
  • Decide the approach to SCCs preparation and filing (e.g. whether a filing is made as a group or on a per-controller basis).

Timeline: Now

2. Consent: separate, explicit consent from data subjects to CBDTs of China personal data

  • Review/update existing PRC privacy notice(s) and consent language, if not already done, to ensure separate, explicit consent to CBDTs (and general compliance with the PIPL etc.).
  • Recent TC260 guidelines published in June 2023 provide practical, industry-specific guidance on giving notice and obtaining consent.

Timeline: Now

3. PIIA: complete for each primary CBDT where the PRC entity/ies is the personal information controller

Complete a CBDT PIIA for each primary CBDT, using the new CBDT PIIA template (as per the Guidelines).

The new CBDT PIIA is now more detailed and will be reviewed and accepted/rejected/queried by the local CAC; as such, the PIIA must be completed fully and drafted very carefully. That said, the CAC’s expectations as to the level of detail to be included in the CBDT PIIA report (such as the mini-TIA section, descriptions of data transfers, onward recipients etc.) remain unclear (for example, as compared to the very detailed responses required for the CAC Assessment route).

Timeline: Ready to file with accompanying signed China SCCs by no later than 30 November 2023, or otherwise within 10 days of signing the SCCs

4. China SCCs: PRC entity/ies acting as personal information controller re primary transfers

The personal information controller is to sign a China SCCs supplement with the relevant overseas recipient(s) for each primary CBDT.

The CAC has emphasized that the content of the China SCCs cannot be amended or integrated into an existing contract. Organisations should, therefore, adopt and sign a standalone, bilingual supplement incorporating the China SCCs.

Timeline: Signed China SCCs to be filed with the accompanying PIIA by no later than 30 November 2023, or otherwise within 10 days of signing the SCCs

5. File signed China SCCs: PRC entity/ies acting as personal information controller re primary transfers

The personal information controller is to file the signed China SCCs with the local CAC branch(es), along with the following filing materials:

  • the completed PIIA;
  • a certified copy of the personal information controller’s unified social credit code certificate;
  • a certified copy of its legal representative’s ID card;
  • a certified copy of the appointed agent’s ID card;
  • a Power of Attorney appointing an agent to handle the filing; and
  • a commitment letter.

Unfortunately, details of the actual process for making the filing with the local CAC branch(es) have not yet been published.

Timeline: Hard and soft copies of filing materials must be filed no later than 30 November 2023, or otherwise within 10 days of signing the SCCs

6. CAC review of filed China SCCs/PIIA: respond to CAC questions (if any)

  • The local CAC will review the filing.
  • Respond to local CAC questions on the filing (if any), and supplement filing materials as required.

The result of the review is on a pass/fail basis.

Timeline: Local CAC review within 15 working days of receiving the filing

7. China SCCs: flow-down of China SCCs for onward transfers

Identify the relevant vendor list, and flow down the China SCCs to vendors procured by the organisation at a group level (i.e. onward transfers), even though there is no need to file these with the CAC.

The CAC has indicated that the China SCCs currently primarily concern first-tier overseas recipients and C2C/C2P transfers, not onward transfers. In light of this, we have already seen pushback from some vendors as regards entering into the China SCCs, except where they are the primary overseas recipients.

Timeline: Now, to identify the relevant vendor list; then put the China SCCs in place at the next engagement opportunity with relevant vendors

8. Vendor management

Use the organisation’s existing global vendor management data protection programme (e.g. due diligence, ongoing monitoring).

Organisations must reassess, amend, supplement or re-file China SCCs filings in the event of certain changes.

Timeline: Ongoing

9. China SCCs: acting as data processor/client engagements (if relevant)

Where organisations act as data processor, the relevant personal information controller may ask for the China SCCs to be put in place; organisations should assess and determine their in-house approach to this and its impact on standard service agreements etc.

Timeline: Now

10. Non-PRC data controllers within the organisation processing China personal data: await finalization of the CAC certification route

Draft details of the CAC certification scheme have been published and were subject to public consultation (so are likely to change) (see CHINA: CBDT routes now all clear – Draft guidelines for CAC Certification route published – Privacy Matters (dlapiper.com)).

Timeline: Anticipated H2 2023

Broader compliance risks

If a China SCCs filing is rejected because the CAC identifies compliance gaps in an organisation’s general PIPL compliance programme (not just related to CBDT), this may trigger the CAC to take corresponding enforcement action under the PIPL (e.g. requesting remediation or issuing penalties). Therefore, it is critical for organisations to ensure their overall China data protection programme complies with applicable China data protection laws and regulations, and that this is accurately described within the CBDT PIIAs.

Please contact Carolyn Bigg, Amanda Ge or Venus Cheung if you would like to discuss what these latest developments mean for your organisation, or if you would like information about the templates and tools we have developed to support organisations with the above steps.

Ireland/EU: Irish DPC bans Meta’s EU-US data flows and issues record €1.2bn fine
https://privacymatters.dlapiper.com/2023/05/ireland-eu-irish-dpc-bans-metas-eu-us-data-flows-and-issues-record-e1-2bn-fine/
Tue, 23 May 2023

Decision could imperil other companies’ transatlantic transfers as well

By: John Magee, Andrew Dyson, James Sullivan, Andrew Serwin, Claire O’Brien & Rachel De Souza

The Irish Data Protection Commission (DPC) has published a decision that could impact the ability of thousands of companies to move personal data from the European Economic Area (EEA) to the United States.

For Meta, the decision has resulted in a record administrative fine of €1.2bn, an order to suspend further transfers of EEA personal data to the US within five months, and an order to cease all unlawful processing of EEA personal data transferred to the US in violation of GDPR, within six months.

Meta has announced that it will appeal the decision and seek a stay of the orders.

Background

At issue in the inquiry underlying the DPC’s decision was whether Meta’s transfers of EEA personal data to the US, based on Standard Contractual Clauses (SCCs), were legal following the Schrems II judgment by the CJEU nearly three years ago.

That judgment invalidated the EU-US Privacy Shield Framework, but also cast uncertainty on the use of SCCs to transfer personal data to the United States, given the concerns noted by the Court about the US government’s ability to access private sector data.

In the wake of the Schrems II ruling, Meta adopted the modernized SCCs issued by the European Commission in June 2021 and implemented supplementary measures as recommended by the European Data Protection Board (the EDPB) in November 2020 and June 2021.

In July 2022, the DPC first circulated its draft of this decision for review and comment by other European Supervisory Authorities (also known as Concerned Supervisory Authorities (CSAs)). That draft contained the transfer suspension order which forms part of this month’s final decision. After several CSAs lodged objections to perceived inadequacies of the draft decision in relation to the corrective measures proposed, the DPC referred the objections to the EDPB for determination pursuant to the Article 65 GDPR dispute resolution mechanism. The EDPB issued a binding determination to resolve the CSAs’ dispute over whether the DPC should also fine Meta and order it to bring its processing into compliance with the GDPR. The final DPC decision reflects that binding determination by the EDPB.

The DPC decision

The DPC’s decision records the exercise of the following corrective powers:

  1. an order, made pursuant to Article 58(2)(j) GDPR, requiring Meta to suspend any future transfer of personal data to the US within the period of five months from the date of notification of the DPC’s decision to Meta Ireland;
  2. an administrative fine in the amount of €1.2 billion; and
  3. an order, made pursuant to Article 58(2)(d) GDPR, requiring Meta to bring its processing operations into compliance with Chapter V of the GDPR, by ceasing the unlawful processing, including storage, in the US of personal data of EU/EEA users transferred in violation of the GDPR, within six months following the date of notification of the DPC’s decision to Meta Ireland.

Wider implications

Although the DPC decision is limited to the facts in the Meta matter, and Meta plans to appeal the decision and seek a stay, the announcement sends a decidedly unambiguous message to thousands of companies that the costs and complexities of delivering their products and services in certain markets will increase:

 “[T]he analysis in this Decision exposes a situation whereby any internet platform falling within the definition of an electronic communications service provider subject to the FISA 702 PRISM programme may equally fall foul of the requirements of Chapter V GDPR and the EU Charter of Fundamental Rights regarding their transfers of personal data to the USA.”

More uncertainty and risk around EU-US data transfers

Coming amid the ongoing legal vacuum around EU-US data flows since the 2020 Schrems II judgment, the DPC’s suspension order threatens to disrupt Meta’s Facebook operations in Europe. Although it remains to be seen how the appeal process may play out or how Meta might adapt its practices to comply with the order, the social media giant has already indicated that it will find itself without the tools to lawfully transfer personal data to the United States for its Facebook service if an adequacy decision for the EU-US Data Privacy Framework (DPF) is not formally adopted before the suspension order takes effect in October.

Concerns about further disruption to transatlantic data transfers are by no means limited to Meta. In recent public financial filings with regulators, scores of US and European businesses have reported that, until the DPF is adopted by the EU, the uncertainties around SCCs could also impair their ability to process and transfer EEA personal data, thereby limiting their provision of particular products and services. By making the principal existing mechanism for transatlantic transfers so legally risky, the DPC decision may pressure EU companies to consider localising data (among other things, by shifting from US to European cloud service providers).

In particular, in its decision, the DPC concluded that Meta’s reliance on the ‘new’ 2021 SCCs does not compensate for the deficiencies in US law identified in Schrems II – the DPC held that the downstream PRISM programme under Section 702 of the US Foreign Intelligence Surveillance Act (FISA) allows access to a user’s data without court supervision and without the user’s knowledge. Given that Meta cannot stop such access with the SCCs, there is no remedy for an EEA data subject who is not informed that they have been the subject of a FISA 702 search.

In addition, the DPC concluded that Meta did not have in place any supplemental measures which would compensate for the inadequate protection provided by US law. In particular, the supplementary measures do not “provide essentially equivalent protection to EU law against the wide discretion the US Government has to access Meta US users’ personal data via Section 702 FISA (PRISM) requests”.

While the DPC states that “the EDPB Supplemental Measures Recommendations do not exclude a so-called risk-based approach…”, Meta has not compensated for the inadequacies in US law. The DPC’s Meta decision (which reflects the EDPB’s binding determination) offers yet another indicator that European Supervisory Authorities are setting the bar high when it comes to supplementary measures used to protect EEA personal data, irrespective of the actual risk of access to such data by US public authorities. 

EU-US adequacy decision

For organisations transferring personal data to US service providers subject to FISA, the DPC decision leaves few alternatives other than to hope the EU-US adequacy decision for the DPF is adopted this summer as currently expected. The six-month grace period before the decision’s transfer suspension order takes effect clearly leaves the door open for the adequacy decision: “Accordingly, and for the sake of clarity and legal certainty, the orders specified in Section 10, below, will remain effective unless and until the matters giving rise to the finding of infringement of Article 46(1) GDPR have been resolved, including by way of new measures, not currently in operation, such as the possible future adoption of a relevant adequacy decision by the European Commission pursuant to Article 45 GDPR.”

Soundings from the negotiation process indicate a finalized EU-US pact by the end of the summer. However, given that privacy advocates are already planning legal challenges, any EU-US adequacy decision is likely to find its way back to the CJEU sooner rather than later.

Managing risk with ongoing data transfers

Organisations transferring personal data from the EEA to the US will undoubtedly be concerned as to whether the DPC decision undermines the ability to rely on SCCs and any supporting transfer impact assessments (TIAs) as a sufficient safeguard to continue to lawfully transfer data to the US pursuant to GDPR Article 46.

The DPC findings certainly put the validity of those safeguards under increased scrutiny, particularly given the pervasive nature of the concerns raised about US surveillance under FISA. What we don’t know yet, though, is whether European Supervisory Authorities will take downstream enforcement action that replicates the DPC’s findings narrowly, or whether a more risk-based approach will start to emerge. There are certainly inferences in the DPC decision that suggest the door is not closed on a risk-based approach, as previously espoused by the EDPB.

The facts and history of this particular case – involving the transfer of large volumes of personal data known to be in scope of the FISA 702 PRISM surveillance programme – were always going to trigger a material risk of regulatory enforcement. What is less clear is whether regulators will be equally inclined to suspend the more benign data transfers that many organisations make as part of their routine operations (for example, for remote IT support or global HR management), where it may be possible to determine with a high degree of confidence that surveillance under FISA is unlikely to take place. Unless and until regulators go that far, we expect organisations to continue making transfers of data sets that are considered ‘lower risk’. Pending confirmation of the DPF adequacy decision, transfers of ‘higher risk’ data will certainly be a concern, and these should be identified and managed as a potential compliance exposure.

For further information or queries please contact one of the authors or any member of the DLA Piper Data Protection, Privacy and Cybersecurity team.

 

 

Europe: EU General Court Clarifies When Pseudonymized Data is Considered Personal Data
https://privacymatters.dlapiper.com/2023/05/europe-eu-general-court-clarifies-when-pseudonymized-data-is-considered-personal-data/
Thu, 18 May 2023

On 26 April 2023, the General Court of the European Union (EGC) published its judgment in Case T-557/20, Single Resolution Board (SRB) v European Data Protection Supervisor (EDPS), in relation to the threshold between pseudonymous and anonymous data.

The EGC held that pseudonymised data transmitted to a data recipient will not be considered personal data if the recipient of the data does not itself have the means to re-identify the individuals. The EGC also clarified that an individual’s views and opinions are not, by default, personal data – a case-by-case assessment is required.

Background

As part of a resolution scheme, affected shareholders and creditors were asked by the SRB to submit comments under the right-to-be-heard process. The SRB then shared these comments with a third-party consulting firm. Before sharing, the SRB pseudonymised the comments by replacing the name of each individual with an alphanumeric code. The consulting firm was not provided with the decoding key capable of linking the alphanumeric codes to individual respondents. A number of affected shareholders and creditors subsequently filed claims with the EDPS against the SRB, alleging that the SRB had failed to inform them that their personal data would be transmitted to third parties. The SRB argued that the data transmitted were anonymised and therefore were not personal data.
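To make the mechanics concrete, the kind of key-coded pseudonymisation at issue can be sketched as follows. This is purely illustrative and not taken from the judgment: the names, comments and code format are invented, and real implementations vary. The point is that the sender retains the decoding key, while the shared data set alone gives the recipient no means of re-identification.

```python
import secrets


def pseudonymise(comments):
    """Replace each respondent's name with a random alphanumeric code.

    Returns two dictionaries: the pseudonymised data set (which could be
    shared with a third party) and the decoding key, which the sender
    retains and does not share.
    """
    pseudonymised = {}
    decoding_key = {}
    for name, comment in comments.items():
        code = secrets.token_hex(8)  # 16-character random alphanumeric code
        pseudonymised[code] = comment
        decoding_key[code] = name
    return pseudonymised, decoding_key


# Hypothetical right-to-be-heard comments (invented for illustration)
comments = {
    "Alice": "I object to the valuation used in the resolution scheme.",
    "Bob": "The right-to-be-heard window was too short.",
}

shared, key = pseudonymise(comments)
# 'shared' maps codes to comments only; without 'key' the recipient
# has no means of linking a comment back to an individual.
```

On the EGC's reasoning, whether `shared` is personal data is assessed from the recipient's position: for the sender, who holds `key`, it remains personal data, while a recipient holding only `shared` may not be processing personal data at all.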

The EDPS held that the SRB had shared pseudonymised data (rather than anonymised data) with the consulting firm – the alphanumeric code used by the SRB allowed the SRB to link the comments to an individual, even though the third-party consulting firm did not have the decoding key and was not able to identify any individuals. As a result, the EDPS held that personal data had been shared with the third-party consulting firm in breach of the SRB’s information obligations.

The SRB appealed to the EGC, contesting the EDPS’s determination that the information transmitted to the consulting firm constituted personal data.

Key takeaways from the EGC Decision

  • The EGC held that in order to determine whether the information transmitted to the third party constituted personal data, “it is necessary to put oneself in [the third party’s] position in order to determine whether the information transmitted to it relates to ‘identifiable persons’.” The information transmitted to the consulting firm did not constitute information relating to an ‘identified natural person’, as the data recipient did not have any means to re-identify the individuals –  SRB alone held the additional information that enabled the affected shareholders and creditors to be identified.
  • The EGC clarified that whether the data constitutes ‘personal data’ – and therefore falls within the scope of the GDPR – depends on whether the recipient has the means available to it to enable it to access the additional information necessary to re-identify the individuals. The fact that the sender has the means to re-identify individuals does not mean that the transmitted data is automatically also personal data in the hands of the recipient.  However, the EGC did not expressly state the specific conditions for data to be considered anonymous.
  • The EGC concluded that although personal views or opinions may constitute personal data, they cannot be presumed to contain personal data and instead an examination should be carried out of whether, by its “content, purpose or effect, a view is linked to a particular person”. 

Commentary

The EGC decision provides welcome clarity on what information may constitute ‘personal data’, particularly in relation to the test for pseudonymised versus anonymised data. In particular, the decision confirms that where a recipient does not have the means available to it to re-identify individuals, the data it receives does not amount to personal data in its hands – the fact that the sender is able to re-identify the individuals does not mean that the transmitted data is automatically also personal data for the recipient. However, it should be noted that the EDPS is likely to appeal this ruling to the Court of Justice of the European Union.

For further information or if you have any questions, please contact your usual DLA Piper contact.

Europe: CJEU holds that mere infringement of the GDPR does not give rise to a right to compensation
https://privacymatters.dlapiper.com/2023/05/europe-cjeu-holds-that-mere-infringement-of-the-gdpr-does-not-give-rise-to-a-right-to-compensation/
Tue, 09 May 2023

On 4 May 2023, the European Court of Justice (“CJEU”) delivered its judgment on the interpretation of Article 82 of the General Data Protection Regulation (“GDPR”). The CJEU held that mere infringement of the GDPR does not give rise to a right to compensation. However, there is no requirement for the non-material damage suffered to reach a certain threshold of seriousness in order to confer a right to compensation.

Background

Since 2017, Österreichische Post AG, a logistics and postal service provider in Austria, has published address directories and collected information on the political party affinities of the Austrian population. Using an algorithm, it determined whether individuals had an affinity with a particular political party, based on certain socio-demographic features of their address information. The controller also carried out extrapolations in order to determine classifications within the possible target groups for election advertising from various political parties, although the processed data was not transferred to third parties. From one of these extrapolations, Österreichische Post determined that the data subject had a high affinity with one of the political parties.

The data subject, who had not consented to the processing of his personal data, was upset by the storage of his party affinity data and stated that the political affinity specifically attributed to him by Österreichische Post was insulting, shameful and damaging to his reputation. In addition, the data subject claimed that Österreichische Post’s conduct caused him great upset and a loss of confidence, and also a feeling of public exposure. The data subject claimed compensation of EUR 1000 in respect of non-material damage for “inner discomfort” and upset in light of the data processing by Österreichische Post.

Both the first instance court and appellate court dismissed the data subject’s claim, ruling that:

  • Compensation for non-material damage does not automatically accompany every breach of the GDPR, and only damage that goes beyond the upset or feelings caused by the breach is eligible for compensation; and
  • In accordance with underlying Austrian law, mere discomfort and feelings of unpleasantness must be borne by everyone without any consequence in terms of compensation. For compensation, the damage ought to be of a certain significance.

Subsequently, the data subject lodged an appeal at the Austrian Supreme Court, which referred the following questions to the CJEU:

  1. Does the award of compensation under Article 82 GDPR also require, in addition to infringement of provisions of the GDPR, that an applicant must have suffered harm, or is the infringement of provisions of the GDPR in itself sufficient for the award of compensation?
  2. Does the assessment of the compensation depend on further EU-law requirements in addition to the principles of effectiveness and equivalence?
  3. Is it compatible with EU law to take the view that the award of compensation for non-material damage presupposes the existence of a consequence of the infringement of at least some weight that goes beyond the upset caused by that infringement?

CJEU Judgment

The CJEU held that:

  1. It is clear that the right to compensation provided for by the GDPR is subject to three cumulative conditions:
  • infringement of the GDPR,
  • material or non-material damage resulting from that infringement; and
  • a causal link between the damage and the infringement.

As such, it cannot be held that any ‘infringement’ of the provisions of the GDPR, by itself, confers that right to compensation on the data subject – there must be a causal link between the infringement in question and the damage suffered in order to establish a right to compensation. Article 82(1) of the GDPR must therefore be interpreted as meaning that the mere infringement of the provisions of that regulation is not sufficient to confer a right to compensation.

  2. As the GDPR does not contain any rules governing the assessment of damages, national courts must apply the domestic rules of each Member State relating to the criteria for determining the extent of financial compensation payable, provided that the principles of equivalence and effectiveness are complied with.
  3. Article 82(1) GDPR must be interpreted as precluding a national rule or practice which makes compensation for non-material damage subject to the condition that the damage suffered by the data subject has reached a certain degree of seriousness. The GDPR does not contain any such requirement, and such a restriction would be contrary to the broad conception of ‘damage’ adopted by the EU legislature.

Commentary

The CJEU decision will likely be welcomed by organisations facing civil claims for damages following alleged breaches of the GDPR. In particular, the judgment confirms that a mere infringement is not sufficient to confer a right to compensation if there is no ‘causal link’ between the infringement in question and the damage suffered. The judgment highlights the need for data subjects to prove or clarify damage beyond loss of control over personal data and feelings of upset or discomfort. However, the decision still leaves some uncertainty for both controllers and data subjects, leaving it to national courts to determine whether compensation is appropriate on a case-by-case basis, by reference to the domestic rules and case law of each relevant Member State.

This issue is one that is frequently litigated in the US, with somewhat different results due to the constitutional requirements of Article III of the US Constitution.  Those issues were highlighted in the Schrems II case and continue to create legal issues on both sides of the Atlantic.

For further information or if you have any questions, please contact your usual DLA Piper contact.

Europe: Advocate General delivers Opinion on the right of access for purposes unrelated to data protection
https://privacymatters.dlapiper.com/2023/04/europe-advocate-general-delivers-opinion-on-the-right-of-access-for-purposes-unrelated-to-data-protection/ | Tue, 25 Apr 2023

Authors: Verena Grentzenberg and Katja-Maria Harsdorf

On 20 April 2023, the Advocate General (“AG”), Nicholas Emiliou, published his Opinion in the case of FT v DW, (C-307/22). In particular, the AG took the view that Art. 12(5) and Art. 15(3) GDPR must be interpreted as requiring a data controller to provide the data subject with a copy of his or her personal data, even where the data subject requests the copy for purposes unrelated to data protection. This Opinion, if confirmed by the European Court of Justice (“CJEU”), aligns with an increasing tendency, by both data protection supervisory authorities and courts, towards a broad interpretation of the right of access. The case could have far-reaching consequences for the assertion of claims, such as claims for damages, against companies in civil proceedings.

Course of proceedings and questions referred by the German Federal Supreme Court

The case was referred to the CJEU for a preliminary ruling by the German Federal Court of Justice.

In the case, a patient of a dental practice, who suspected a treatment error, requested that the dental practice provide him, free of charge, with a copy of all medical records concerning him that were in the possession of the dental practice. The patient requested the medical information in preparation for a medical malpractice action for damages before a civil court. The dental practice took the view that the medical records should only be provided to the patient if the patient reimbursed the costs.

The Local Court granted the patient’s claim in its judgment of 30 March 2020. On appeal by the dentist, the Regional Court confirmed the decision by way of a judgment of 15 December 2020. In its reasoning, the Court stated that the right of access under Article 15(3) of the GDPR was not excluded by the fact that the patient requested the information in order to examine medical liability claims.

The dentist lodged an appeal before the German Federal Court of Justice. The German Federal Court of Justice, in its decision of 29 March 2022 (Case No. VI ZR 1352/20), decided to stay the proceedings and refer the questions to the CJEU for a preliminary ruling.  In particular, the German Federal Court of Justice referred the question as to whether Art. 12(5) and Art. 15(3) GDPR are to be interpreted as meaning that the controller is not obliged to provide the data subject with a first copy of his or her personal data free of charge, if the data subject requests the copy for purposes unrelated to data protection.

Further questions referred concerned the admissibility of an obligation to bear the costs for the information for the person concerned under national law as well as the question of whether the person concerned must be provided with a complete copy of the patient file.

Opinion of the Advocate General

In its Opinion, the AG proposed that the CJEU answer the first question referred as follows:

Articles 12(5) and 15(3) of the GDPR must be interpreted as requiring a data controller to provide the data subject with a copy of his or her personal data, even where the data subject does not request the copy for the purposes referred to in recital 63 of the GDPR, but for a different purpose, unrelated to data protection.

The AG justifies this Opinion on the basis of the broad wording of Art. 12(5) and Art. 15(3) GDPR. The scheme of the GDPR also supports a broad interpretation, since other provisions of the GDPR, e.g., Article 17(3) GDPR, provide for explicit exceptions to data subjects’ rights, whereas Article 12(5) and Article 15(3) of the GDPR do not. Although the AG concedes that the wording of recital 63 of the GDPR is not entirely clear, he takes the view that it cannot be deduced from the recital that the right of access should be guaranteed exclusively for the purposes mentioned there (“in order to be aware of, and verify, the lawfulness of the processing”).

In contrast, according to the AG, national legislation that provides for a cost reimbursement obligation for patients may be permissible under certain circumstances based on Art. 23(1) GDPR; in particular, if the costs to be reimbursed are strictly limited to the actual costs incurred in this regard. Also, the AG states that in the context of a doctor-patient relationship, Article 15(3) GDPR cannot be interpreted as conferring on the data subject a general right to obtain a full copy of the documents included in his or her medical file. However, the controller is to provide the data subject with a partial or full copy of the documents, when that is necessary to ensure that the data provided is intelligible, and that the data subject is able to verify that the data provided is complete and accurate.

Comment

The AG’s position is not surprising. In the request for a preliminary ruling, the German Federal Court of Justice had already made it relatively clear that, in its view, the assertion of the right of access was not dependent on the pursuit of the purposes mentioned in recital 63 of the GDPR. The European Data Protection Board (EDPB) had also formulated the view in its “Guidelines 01/2022 on data subject rights – Right of access” – to which the AG Opinion also refers – that controllers “should not assess ‘why’ the data subject is requesting access, but only ‘what’ the data subject is requesting … and whether they hold personal data relating to that individual”. It can be assumed that the judgment of the CJEU will adopt the same approach, as the Court largely follows the Opinions of the Advocate General.

This would follow a growing trend of case law that assumes a broad interpretation of the right of access – in favour of data subjects and to the detriment of controllers. For example, the CJEU recently ruled that, when exercising their right of access under the GDPR, data subjects must be provided with the identities of the individual recipients of their personal data (CJEU, judgment of 12 January 2023 – C-154/21 – RW v Österreichische Post AG) (see our blog post). The German Federal Court of Justice also interpreted the scope of the right of access broadly in its judgment of 15 June 2021, when it ruled that, among other things, internal notes and internal communications about the data subject are not categorically excluded from the scope of the right of access pursuant to Art. 15(1) GDPR (German Federal Court of Justice, judgment of 15.6.2021 – VI ZR 576/19, para. 24 et seq.).

This broad interpretation of the right of access is increasingly subject to criticism – while it is easy for data subjects to exercise their right of access, compliance often requires considerable effort and resources on the part of controllers. Data subjects are therefore able to exert considerable pressure on companies by exercising their right of access.

If, as in the present case, the right of access is used to obtain evidence, this is likely to undermine central principles of German civil procedure. German civil procedure has no pre-trial discovery comparable to US civil procedure, which makes it possible to investigate the opponent, e.g., by inspecting business documents. Rather, to ensure equality of arms, the principle of production of evidence (‘Beibringungsgrundsatz’) applies, according to which each party is obliged to present the facts relevant to the decision. Under the basic rule on the distribution of the burden of presentation and proof, each party must present the facts that are favourable to him or her and prove them in the event of a dispute. It is also recognised that it is not permissible to investigate an opponent without permission. This balance is clearly disturbed by a broad interpretation of the right of access under data protection law.

Nevertheless, controllers are not completely unprotected in the face of access requests.

Pursuant to Art. 12 (5) sentence 2 lit. b) GDPR, the controller may refuse to act on the request where requests from data subjects are “manifestly unfounded or excessive”. The general objection of abuse of rights is also applicable, e.g., if the data subject pursues objectives disapproved of by the legal system with his or her request, or acts fraudulently or vexatiously, which the German Federal Court of Justice also points out in its request for a preliminary ruling.

As a last bulwark against requests aimed at pre-trial discovery, national regulations based on Article 23(1)(i) of the GDPR are likely to become particularly important. In Germany, for example, the Federal Data Protection Act (Bundesdatenschutzgesetz – ‘BDSG’) allows a controller to refuse access if it would otherwise disclose information that by its nature must be kept secret, in particular because of the overriding legitimate interests of a third party (Section 29(1) sentence 2 BDSG). It is to be hoped that courts will recognise that, based on this norm, data controllers may also refuse to disclose information that could otherwise be used as evidence against them in civil proceedings. If there are no corresponding national regulations, controllers will have to rely directly on Article 15(4) of the GDPR.

In this context, it is encouraging that the Advocate General recognises the possible restrictions of the right of access by “the rights and freedoms of others” in Art. 15(4), Art. 23(1)(i) and recital 63 of the GDPR, and apparently interprets them relatively broadly. For instance, he expressly states that these also include restrictions that serve to protect the fundamental economic rights of individuals, including those of controllers.

For further information or if you have any questions, please contact your usual DLA Piper contact.

UK: Information Commissioner publishes response to new AI White Paper
https://privacymatters.dlapiper.com/2023/04/uk-information-commissioner-publishes-response-to-new-ai-white-paper/ | Thu, 13 Apr 2023

Authors: Coran Darling and Rachel De Souza

On 29 March 2023, the UK Government (“Government”) published its long-awaited white paper (“Paper”), setting out the Government’s proposals to govern and regulate artificial intelligence (“AI”). Headed “A Pro-Innovation Approach”, the Paper recognises the importance of a framework that builds trust and confidence in the responsible use of AI within the UK, while also acknowledging the risk that ‘overbearing’ regulation may adversely impact innovation and investment. Further details of the Paper can be found in DLA Piper’s earlier analysis of the proposals.

The Information Commissioner’s Office and its role in AI

On 11 April 2023, the Information Commissioner’s Office (“ICO”), issued its response to the Government’s consultation on the Paper.

The ICO recognises the criticality of AI in the development of the UK’s prosperity, and the potential for regulation to push this in the correct direction. Equally, the ICO recognises AI’s immutable relationship with personal data. It is therefore no surprise that the ICO acknowledges that it plays a central role in the governance of AI. However, the ICO also recognises the important role that other UK regulators play in governing the use and development of AI in different sectors or contexts.

Comments, Concerns, and Considerations: ICO Response to AI White Paper Proposals

The Role of Regulators:

The ICO’s response to the regulator-led approach in the Paper is largely positive – the ICO welcomes the Government’s intention to have regulators create joint guidance and ‘sandboxes’ for organisations. However, as regulators must create guidance and advice in alignment with the laws they oversee, separately from the Government, the ICO has requested clarification of the specific roles of regulators and the Government in issuing guidance and advice, particularly where there is potential for overlapping and contradictory areas of oversight. The ICO proposes that one method to mitigate this is the use of collaborative initiatives, such as the Digital Regulation Cooperation Forum (DRCF), which already plays an active role in the analysis of the impacts of AI across its member regulators’ sectors.

Statutory Duties and AI Principles:

One of the primary aspects of the Paper is the creation of a set of principles, to which regulators should have due regard when implementing their guidance and monitoring their respective sectors. The ICO concludes that these principles “map closely to those found within the UK data protection framework”. However, the ICO recognises the need to work closely with the Government to ensure the Paper’s principles can be interpreted in a way that is compatible with data protection principles, “to avoid creating additional burden or complexity for businesses”.

The ICO provides specific comments which it hopes will assist to bring about consistency:

  • Fairness: The ICO states that the principle of fairness, in similar fashion to the data protection fairness principle, should apply across the stages of development of an AI system, as well as its use. The definition of the principle should therefore specifically include reference to development as part of its requirements.
  • Contestability and redress: Under this principle, regulators will be expected to clarify methods of contesting outputs/decisions made by AI and avenues of redress should things go wrong. As the ICO notes, it is typically the organisation using the AI system, not the regulator, that is expected to clarify these details. The ICO therefore requests clarity in relation to regulators’ roles in this area.
  • Interactions with UK GDPR Article 22: Under the Paper’s proposals, in instances of automated decisions that have a legal or significant effect on a person, regulators must consider the suitability of requiring AI system operators to provide an appropriate justification for that decision to affected parties. However, where Article 22 UK GDPR is engaged, this justification is required, rather than simply a consideration. The ICO therefore requests clarification on this point, to ensure that this does not create confusion and contradictory standards. While the Paper acknowledges that these types of conflicts may emerge, the ICO concludes that given a substantial portion of AI systems will process personal data, it is important for regulators to interpret the principles in the Paper in a way that is compatible with their meaning under UK data protection law.

Format of Guidance:

The Paper proposes that regulators should work together, where possible, to produce joint guidance.  The ICO recommends that the Government prioritise research into the types of guidance organisations involved in AI would benefit from.

Sandboxes:

In similar fashion to those we have seen in the EU, the Paper also proposes the creation of a joint regulatory sandbox for AI. The ICO recognises that this will provide clarity to AI developers on how the law will apply to varying use cases and assist with innovation and investment.

As in the case of guidance, the ICO notes that it would be prudent for the Government to carry out further research to determine where the best value to AI developers would be added.

Based on its own experience of operating regulatory sandboxes, the ICO recommends the following:

  • Scope of support: the ICO recommends that the scope of any sandbox should be extended to include all digital innovation, rather than just AI, noting that it is unlikely that innovators will be strictly limited to AI and that they will want to progress a much wider family of technologies under the oversight of regulators.
  • Depth of support: the ICO states that timely support should be given, with the aim of providing clarification to businesses on the relevant laws. More intensive and thorough testing in such an environment may limit the number of businesses that can be helped, and may also only benefit businesses that are subject to very specific regulatory authorisation, such as for medical devices. A balance should therefore be struck to assist as many organisations as possible.
  • Prioritisation of support: The ICO proposes that support should be prioritised in accordance with international best practice, focusing on: i) the degree of potential innovation; ii) the degree of regulatory barriers faced and support required; and iii) the potential for wider economic, social, and/or environmental benefit.

Cost Implications:

The ICO recognises that the proposals in the Paper will incur new and additional costs to cross-economy regulators, including the ICO, which will now need to produce products tailored to different sectoral contexts in coordination with other relevant AI regulators. This may understandably strain certain regulatory budgets, and so the ICO welcomes discussions with the Government to enable the Paper’s wider proposals to succeed.

Find out more

For more information on AI and the emerging legal and regulatory standards, visit DLA Piper’s focus page on AI.

DLA Piper continues to monitor updates and developments of AI and its impacts on industry across the world. For further information or if you have any questions, please contact your usual DLA Piper contact.

 

Ireland: DPC Produces “Significant Outputs” for 2022 Concluding 17 Large Scale Inquiries
https://privacymatters.dlapiper.com/2023/03/ireland-dpc-produces-significant-outputs-for-2022-concluding-17-large-scale-inquiries/ | Thu, 30 Mar 2023

By John Magee, Emer McEntaggart, Eilis McDonald, Nicole Fitzpatrick, Sarah Dunne, David Brazil & Christopher Connell

The Data Protection Commission (DPC) has published its 2022 Annual Report, highlighting the DPC’s progress on (i) ongoing large-scale inquiries (in particular against social media platforms), (ii) defence of cross-border decisions, and (iii) increased interaction with the European Data Protection Board (EDPB). Two-thirds of the GDPR fines issued by EU data protection authorities last year were from the DPC, illustrating a continued commitment to enforcement.

In total, the DPC received 5,695 valid notifications of personal data breaches in 2022 (a 13% reduction compared to 2021), with (i) 62% of these notifications being related to unauthorised disclosure of personal data, (ii) 105 being received under the ePrivacy Regulations (an increase of 176% compared to 2021), and (iii) 38 relating to the Law Enforcement Directive.

Most incidents reported originated from the private sector (52%), followed by the public sector (44%), with the remaining coming from the voluntary and charity sector (4%).

Complaints Handling

The Annual Report notes another year of extensive enforcement work by the DPC. In total, 10,008 cases were concluded by the DPC in 2022 (a slight reduction on last year’s figures). As of 31 December 2022, the DPC had 88 statutory inquiries on hand, including 22 cross-border inquiries. In addition to its cases and inquiries, the DPC also handled over 21,230 electronic contacts, 6,855 phone calls and 1,118 postal contacts.

The Annual Report highlights that the most frequent GDPR topics for queries and complaints in 2022 were access requests; fair processing; disclosure; direct marketing and right to be forgotten (delisting and/or removal requests). The most frequent cause of breaches, representing 62% of breaches in 2022, related to correspondence inadvertently being misdirected to the wrong recipients.

Administrative Fines and Large-Scale Inquiries

The Annual Report highlights 17 large-scale inquiries concluded throughout the year resulting in fines totalling over €1.3 billion.

Of note is the Meta (Instagram) decision which resulted in a record fine of €405 million and provided clarification on the lawfulness of processing children’s personal data in accordance with the legal bases of ‘performance of contract’ and ‘legitimate interest’. This is the second highest fine (after Luxembourg’s regulators issued a fine of €746 million last year) since the GDPR came into effect, and the largest fine to date issued by the DPC. The decision is currently before the High Court where Meta has sought an order quashing the decision and claims, amongst other things, that the fine amounts to a criminal sanction and is unconstitutional. Our analysis of the Meta (Instagram) decision can be read here.

Ongoing Inquiries

The Annual Report lists 7 inquiries at draft decision stage as of December 2022, 3 of which the DPC has now delivered its final decisions on (in early 2023).

Of note is the DPC’s ongoing inquiry into the Department of Social Protection which commenced in 2021. This inquiry involves the processing of personal data in relation to biometric facial templates used in the department’s registration process. The Annual Report confirms that a draft decision is currently underway.

In addition, as of 31 December 2022, the DPC has 15 cross-border decisions under Articles 60 and 65 at various stages from draft decision to investigation stage. The DPC has received objections and comments on its draft decision in the Meta own volition inquiry, relating to the lawfulness of Facebook’s transfers of personal data to the US (which is currently at the resolution stage). The Annual Report also confirms that the DPC is at an advanced stage in preparing a preliminary draft decision in the Google inquiry, relating to its processing of personal data by its real time bidding advertising technology system.

Litigation

The Annual Report highlights 9 litigation matters involving the DPC in which written judgments were issued in 2022; 6 statutory appeals, 1 judicial review, and 2 other appeals.

Of the 9 litigation matters, 6 were concluded and the other 3 remain subject to further appeals. Of the 6 matters that were concluded; 3 appeals taken against the DPC were dismissed; 1 judicial review was settled; 1 appeal was partially upheld and 1 appeal was fully upheld. The DPC expects to be involved in more Irish data protection litigation during 2023.

The judicial review proceedings taken by Schrems challenging the DPC’s inquiry into Facebook’s EU-US data transfers, are also referenced. Despite these proceedings being settled by Schrems and the DPC, the parties were unable to reach an agreement on costs. As such, in September 2022, the High Court ordered that the DPC pay 80% of Schrems’ costs. The High Court deducted 20% to reflect the fact that Schrems had not pursued an order quashing the DPC’s inquiry or for other reliefs.

2022 brought the first compensation case under section 117 of the Data Protection Act 2018 to proceed to hearing in Ireland. The case was however dismissed by the Circuit Court and a written decision from the Irish courts on how non-material damage will be assessed is still awaited. The case was taken after SIPTU (a trade union) inadvertently sent an email with the names and addresses of claimants to a group of other SIPTU members. The Judge, on hearing the nature of the claim, found that proof of more than minimal loss was necessary and that no evidence was presented of any actual loss suffered resulting from the email distribution.

International Activities

Over the course of 2022, the DPC continued to participate in work programmes of European supervisory bodies and, despite continued travel restrictions, the DPC attended and actively participated in monthly plenary meetings (approximately 300).

The DPC also continued to invest considerable resources in the day-to-day operation of the One-Stop-Shop in the performance of its role as Lead Supervisory Authority, including working with other authorities assisting on a broad range of matters and keeping other concerned authorities updated on issues and developments.

Data transfer issues have been an investment area for many enterprises across every sector. The DPC has been focusing on the assessment and approval of Binding Corporate Rules (BCRs) applications from multinational companies. During 2022, the DPC acted as lead reviewer in relation to 27 applications from 16 different companies; worked on obtaining approval for 3 of those applications and acted as co-reviewer or on drafting teams for Article 64 Opinions on 6 BCRs.

Children

The protection of children’s personal data is one of the DPC’s five strategic goals as set out in its 2022-2027 Regulatory Strategy.

Following on from its publication of the ‘Fundamentals’ guidance on children’s data protection rights in 2021, the DPC issued three short guides in 2022 for children aged 13 and over on their data protection rights. The aim of the guides is to inform children and enable good practices around online safety. The guides will also be helpful to parents, educators and other stakeholders.

In response to a query from a public sector organisation on the use of social media advertising tools to target children or the parents of children under 16, the DPC advised that the organisation would likely be a joint controller with the social media platform and would share responsibility for ensuring compliant processing. This means that the organisation needed to prepare the necessary compliance documentation setting out the legal basis for processing. The DPC questioned whether consent would be the most appropriate legal basis, given that children or parents cannot consent to targeted advertising if they must accept it as a condition for using the service in the first place. The DPC highlighted the relevance of this engagement to other public bodies. It stated that it cannot provide blanket endorsements of social media advertising tools, noting that they must be assessed on a case-by-case basis by organisations. The DPC also noted that “there is a lot of confusion around the appropriateness of consent as a lawful basis” and that alternative legal bases should be considered in light of any particular duties and obligations to children and any other relevant contextual factors.

Data Protection Officers

During 2022, the DPC continued its efforts to support the Data Protection Officer (DPO) Network, which was established in 2019. The DPC hosted 32 online webinars for members of the DPO Network on topics ranging from access requests to compiling records of processing activities. Another key focus of the DPC was continued engagement with data controllers on Article 37 GDPR compliance (designation and notification of a DPO). Following numerous attempts by the DPC to engage with a public sector body, the DPC opened an Inquiry (in accordance with section 110(1) of the Data Protection Act 2018) into the Pre-Hospital Emergency Care Council, which was the last public body to be brought within compliance of Article 37. The Inquiry concluded in May 2022, resulting in a finding of infringements of Articles 31, 37(1) and 37(7) of the GDPR, i.e., failures to designate and provide contact details of the DPO, as well as failure to cooperate, on request, with the DPC in the performance of its tasks. This highlights the need for data controllers to understand the requirements of Article 37 on the designation of DPOs, and stresses the importance of cooperating and engaging with the DPC.

Supervision

A sectoral breakdown shows that of the 322 consultation requests received by the DPC during 2022, 135 (42%) came from public sector organisations, with the remainder from the private sector. The DPC also provided guidance and observations on 30 proposed legislative measures.

Supervisory engagements undertaken by the DPC in 2022 included engagement with the technology sector on a range of projects, including proposed amendments to the lawful basis for core processing activities, the introduction of new privacy controls for end users, transparency, and child sexual abuse material. The DPC also engaged with the financial services sector on the migration of a customer database of mortgage holders following a large loan sale, providing guidance on security of transfer, accuracy of data, providing information to customers, and ensuring that customers' data protection rights were not adversely affected.

The Annual Report highlights the DPC's accomplishments in 2022, including the conclusion of 17 large-scale investigations that resulted in fines totalling over €1.3 billion. The DPC's data subject-centric approach to complaint resolution and its handling of own-volition inquiries continue. The DPC will also continue to monitor and enforce compliance, particularly in relation to breach notifications, access rights and the processing of children's data.

New Zealand: Digital Identity Services Trust Framework Bill passes final reading https://privacymatters.dlapiper.com/2023/03/new-zealand-digital-identity-services-trust-framework-bill-passes-final-reading/ Thu, 30 Mar 2023

Authors: Alex Moore (Associate, Auckland) and Nick Valentine (Partner, Auckland)

On 30 March 2023, the Digital Identity Services Trust Framework Bill (the Bill) passed its third and final reading in New Zealand's House of Representatives with cross-party support. The Digital Identity Services Trust Framework Act will come into effect by 1 July 2024 at the latest. It is a 'flagship initiative' under the current Government's Digital Strategy for Aotearoa New Zealand and will establish a voluntary accreditation scheme for digital identity service providers, similar to existing frameworks in the United Kingdom, Canada and Australia.

What is it?

The Framework is similar to the equivalent scheme in Australia – digital identification verification service providers who opt-in will be required to adhere to a set of trust framework rules (the TF Rules) and in return will be granted the right to use a mark accrediting their services. Individuals and businesses that use the identity verification services are not required to be accredited.

Again, in alignment with the Australian framework, the Bill establishes two administrative bodies: the Trust Framework Board (the TF Board) and the Trust Framework Authority (the TF Authority).  The TF Board will take on governance responsibilities for the framework, including providing guidance about the framework, monitoring its performance and advising the Minister on making and updating the trust framework rules. The TF Authority will be responsible for the day-to-day operations of the framework, including assessing accreditations, investigating complaints, enforcing the TF Rules, and granting remedies for breaches.

While the Bill represents an important building block of the Government’s Digital Strategy, the Bill itself does not establish the TF Rules. The TF Rules will instead be set out in Secondary Legislation made by the Minister and will, at a minimum, cover requirements for identification management, privacy and confidentiality, security and risk, information and data management, and sharing and facilitation.

What do we think?

Although the Bill is a step in the right direction for encouraging public trust of digital services, it does very little to grapple with the bigger issues of the online world, such as rights to digital identity and the data associated with an individual’s online presence and interactions.

In many ways, the Bill reflects the slow and largely toothless approach to digital governance in New Zealand to date. New Zealand is often playing catch-up when it comes to regulating digital technologies and, given it has taken 18 months to reach this stage without actually establishing any substantive rules for the provision of secure and trusted digital identity services, the Bill's final passage through the House feels a little underwhelming, particularly as the scheme is voluntary, is largely based on the equivalent Australian framework, and was developed with the benefit of learning from similar frameworks in the United Kingdom and Canada.

What’s unique?

Despite its clear Commonwealth influences, the Bill does introduce an important element which is unique to Aotearoa – the need to consider te ao Māori (broadly, the Māori worldview including tikanga Māori – Māori customs and protocols) approaches to identity when developing the TF Rules. The Bill establishes a Māori Advisory Group which the TF Board will be required to consult with prior to advising the Minister on the making of TF Rules and the TF Board must also include members “with expert knowledge of te ao Māori approaches to identity.” These consultation and participation requirements are intended to facilitate equitable Māori participation in the digital environment and recognise the Government’s commitment to the principle of partnership under te Tiriti o Waitangi (the founding document of colonial New Zealand).

In a rather ironic twist, the legislative process itself fell victim to authenticity issues in the online world as a result of online misinformation campaigns during the height of the COVID-19 pandemic. Of the roughly 4,500 public submissions on the Bill, around 4,050 were received during the last two days of the six-week public consultation period, including 3,600 in the final three hours. Parliamentary advisers attributed this influx to "misinformation campaigns on social media that caused many submitters to believe that the Bill related to COVID-19 vaccination passes." Perhaps this incident was evidence enough that New Zealand needs to take a more proactive approach to regulating the digital environment; the Bill, after all, ended up attracting cross-party support.

What’s next?

While the Bill’s passage itself is nothing to write home about, it will be interesting to see how the framework grapples with te ao Māori perspectives on identity in practice. Hopefully, the cross-party support this Bill garnered will energise the Government to tackle some of the bigger digital rights and privacy issues we are currently facing, both nationally and globally.
