Magda Zmorka | Privacy Matters | DLA Piper Data Protection and Privacy
https://privacymatters.dlapiper.com/author/mzmorka/

CHINA: new Anti-Espionage Law and its impact on your China data and operations – how your organisation should respond
https://privacymatters.dlapiper.com/2023/05/china-new-anti-espionage-law-and-its-impact-on-your-china-data-and-operations-how-your-organisation-should-respond/
Tue, 16 May 2023
Authors: Carolyn Bigg, Amanda Ge, Venus Cheung, Gwyneth To

China’s amended Anti-Espionage Law will take effect from 1 July 2023. However, its effects have already been felt by some international businesses. So what should international businesses do to respond to these new risks?

The new law broadens the scope of espionage activities, as well as the power for authorities to carry out anti-espionage investigations by gaining access to data and property.

Given the observed increase in enforcement targeting espionage activities, organisations are advised to adopt internal governance mechanisms that ensure compliance with the relevant laws, and to be ready to react promptly to any potential enforcement action.

Applicability and extra-territorial effect

The new law applies to a widened scope of espionage activities, and can potentially impact different types of data and activities.

In particular, organisations dealing with state secrets should be aware of the far-reaching applicability of the new law. Given the uncertainty as to what constitutes a state secret, organisations should constantly review and assess risks, and be attentive to the types of data processed as part of their business operations.

With this in mind, organisations which deal with more sensitive types of data such as defence and advanced technology should take extra care in remaining compliant with the law (including keeping such data within Mainland China unless relevant approvals are obtained). Additionally, organisations which have contact with national security authorities should ensure all communications and interactions are kept confidential within the organisation.

Notably, the new law does not limit espionage activities to those carried out within China. That said, the focus appears to be on activities that may, in any way, impact the national security and public interests of China.

The new law also applies to espionage activities against third countries that are carried out by espionage organisations and their agents within the territory of China, or that otherwise involve Chinese citizens or organisations or fall under other conditions set out in the law, so long as such activities endanger the national security of China. Thus, activities not specifically targeting China may also fall within the regulatory scope.

Managing data risks

Both local and foreign organisations should be mindful of the significance of this new law if they have China-related business activities or connections.

One of the key internal data governance actions that an organisation should prioritise in connection with this new law is data mapping and classification, in order to maintain an accurate data inventory and to ensure a clear understanding of its data flows and processing activities. As noted above, this is particularly important with regard to state secrets and “important data”. Data compliance programmes should extend beyond just personal data to cover these other China data categories, and should include education on such restrictions and sensitivities for personnel beyond just those in China.

Authorities’ powers

During the course of carrying out anti-espionage investigations, national security authorities are now granted the power to access official buildings and factories, requisition transportation and communication tools, check personal IDs and belongings, examine and seal up electronic devices, review and obtain documents and materials, summon and interview relevant stakeholders, freeze and seize properties, impose border entry and exit restrictions, and shut down websites and networks.

What to do in the event of regulatory investigations / dawn raid

In the event of regulatory investigations, representatives of organisations should:

  • first, ensure that the investigators have due authority and that due procedures are followed;
  • refer to internal investigation/dawn raid guidelines, and follow the detailed step-by-step guidance on dealing with authorities’ enquiries or investigations. In particular, follow proper internal reporting and escalation procedures in case of dawn raids; and
  • keep records of the data and information provided to regulators as part of the dawn raid.
Europe: Opinion of the Advocate General on presumed fault of the controller in case of unlawful third-party access to personal data
https://privacymatters.dlapiper.com/2023/04/europe-opinion-of-the-advocate-general-on-presumed-fault-of-the-controller-in-case-of-unlawful-third-party-access-to-personal-data/
Thu, 27 Apr 2023
Authors: Verena Grentzenberg, Andreas Rüdiger, Ludwig Lauer

In his Opinion of 27 April 2023 (C-340/21), delivered in the context of a reference for a preliminary ruling of Bulgarian origin, the Advocate General of the European Court of Justice (“ECJ”) commented on the interpretation of the right to compensation for non-material damage pursuant to Article 82 (1) GDPR, as well as on the requirements for, and the duty to disclose, the technical and organizational measures pursuant to Articles 24 and 32 GDPR in the event of a cyber-attack.

Facts of the case

The Bulgarian National Revenue Agency (hereinafter referred to as “NAP”) was the target of a cyber-attack which led to unauthorized access to NAP’s information system. In the course of this cyber-attack, personal data – mainly tax and social security information – of approximately 4 million Bulgarian citizens (or approximately 6 million citizens in total, including foreign citizens) were accessed and published on the Internet. The plaintiff was among them.

In the proceedings at first instance before the Administrative Court of the City of Sofia (hereinafter referred to as “ASSG”), the plaintiff claimed approximately EUR 500 in damages under Article 82 (1) GDPR. He argued that NAP had failed to ensure its cybersecurity in an appropriate manner and that its failure to apply appropriate technical and organizational measures in accordance with Articles 24 and 32 GDPR had resulted in a breach of the protection of his personal data. The plaintiff described the non-material damage he suffered as worries, fears and anxieties about possible future misuse of his personal data.

The NAP, as the defendant, considered the claim to be unfounded. It argued that a cyber-attack does not in itself allow conclusions to be drawn about a lack of technical and organizational measures, and that, having been the victim of a cyber-attack by third parties who were not its employees, it could not be (co-)responsible for the damage incurred and was therefore exempt from liability pursuant to Article 82 (3) GDPR.

Decisions of the court of first instance and referral to the ECJ

The ASSG dismissed the claim, taking the view that the dissemination of the data was not attributable to the NAP, that the burden of proof as to whether the measures implemented were appropriate was on the plaintiff, and that non-material damage was not eligible for compensation.

Hearing the case on appeal, the Bulgarian Supreme Administrative Court referred a number of questions to the ECJ with regard to

  • the presumption that technical and organisational measures in accordance with Art. 32 GDPR are not sufficient in case a cyber-attack occurs;
  • the subject matter and scope of the judicial review re. the appropriateness of technical and organizational measures;
  • the controller’s burden of proof that the technical and organisational measures are appropriate;
  • the exemption of liability under Art. 82 (3) GDPR in connection with cyber-attacks; and
  • the threshold for the non-material damages under Art. 82 (1) GDPR.

Statements of the Advocate General of the ECJ

The core statements of the Advocate General of the ECJ are as follows:

  • According to the Advocate General, the occurrence of a “personal data breach” is not sufficient in itself to conclude that the technical and organisational measures implemented by the controller were not “appropriate” to ensure data protection. The assessment of the appropriateness of those measures must be based on a balancing exercise between the interests of the data subject and the economic interests and technological capacity of the controller, in compliance with the general principle of proportionality.
  • Further, the Advocate General states that, when verifying whether the measures are appropriate, the national court must carry out a review which extends to a specific analysis of the content of those measures and the manner in which they were applied, as well as of their practical effects.
  • The Advocate General states that the burden of proving that the technical and organisational measures are appropriate is on the controller. In accordance with the principle of procedural autonomy, it is for the national legal order of each Member State to determine the admissible methods of proof and their probative value, including the measures of inquiry.
  • The fact that the infringement of the GDPR was committed by a third party does not in itself constitute a ground for exempting the controller. In order to be exempted from liability, the controller must demonstrate, to a high standard of proof, that it is not in any way responsible for the event giving rise to the damage. Liability for unlawful processing of personal data is, in effect, aggravated liability based on presumed fault, which leaves the controller the possibility of providing exonerating evidence.
  • Finally, according to the Advocate General, detriment consisting in the fear of a potential misuse of one’s personal data in the future, the existence of which the data subject has demonstrated, may constitute non-material damage giving rise to a right to compensation, provided that it is a matter of actual and certain emotional damage and not simply trouble and inconvenience.

Conclusion

Although the opinion of the Advocate General is not binding on the ECJ, the ECJ can generally be expected to follow it in its final judgment. If it does, the judgment will have a significant impact on, and relevance for, companies that process data. As the number of cyber-attacks increases constantly, virtually any company can be affected. It is therefore of utmost importance to be prepared for such an eventuality and to review and, where necessary, amend the implemented technical and organisational measures in accordance with Art. 32 GDPR. Even though a cyber-attack can probably never be completely prevented, in light of the Advocate General’s opinion and the associated burden of proof, companies are strongly advised to regularly check their technical and organizational measures as part of internal audits and to ensure sufficient documentation that would stand up in court. Such audits also need to cover processors and even sub-processors. Furthermore, contracts with processors and sub-processors need to adequately address not just the allocation of responsibility, but also court-proof documentation.

Germany: ECJ ruling on employee data protection
https://privacymatters.dlapiper.com/2023/03/germany-ecj-ruling-on-employee-data-protection/
Fri, 31 Mar 2023
Authors: Eleni Alexiou, Katharina Pauls

On 30 March 2023, the European Court of Justice (ECJ) ruled on the requirements for national legal bases regarding employee data protection in the context of a referral procedure. Based on its ruling, the German provision that gave rise to the referral procedure (Sec. 23 (1) sentence 1 of the Data Protection and Freedom of Information Act of the German federal state of Hessen – HDSIG) is in all likelihood contrary to EU law, as it does not comply with the requirements of the opening clause in Art. 88 of the General Data Protection Regulation (GDPR). Sec. 23 (1) sentence 1 HDSIG, as a key provision regarding employee data protection, regulates the permissibility of processing personal data by public bodies in the German federal state of Hessen. Since the provision is almost identical in wording to the corresponding key German federal provision in Sec. 26 (1) sentence 1 of the Federal Data Protection Act (BDSG), the ruling is also of nationwide significance in the non-public sector. Employers in Germany have so far extensively relied on Sec. 26 (1) sentence 1 BDSG as the legal basis for processing personal data of their employees. This practice should now be reviewed and – where necessary – adjusted.

Background

The case decided by the ECJ (C-34/21) is based on a reference for a preliminary ruling from the Administrative Court of Wiesbaden. The referring court dealt with Sec. 23 (1) sentence 1 HDSIG in the context of a legal dispute and doubted its conformity with EU law. Under the opening clause in Art. 88 GDPR, national legislators have wide scope to legislate on the processing of personal data in the employment context and may create their own legal bases in this respect. However, according to Art. 88 (1) GDPR, an essential prerequisite is that such provisions are “more specific”. Sec. 23 (1) sentence 1 HDSIG merely reflects the content of Art. 6 (1) (b) GDPR, which is why the referring court doubted whether it could be a “more specific” provision. The court therefore asked the ECJ, on the one hand, what makes a legal provision more specific within the meaning of Art. 88 (1) GDPR and, on the other hand, whether a provision can remain applicable even if it does not meet the requirements of the opening clause.

In his opinion published on 22 September 2022, the Advocate General expressed the view that the provision of the German federal state of Hessen does not meet the requirements of the opening clause. In his view, it is merely a repetition of the provisions of Art. 6, Art. 88 (1) and Art. 5 GDPR and not a more specific provision as required by Art. 88 (1) GDPR. A legal provision issued by a Member State is only a “more specific” provision within the meaning of Art. 88 (1) GDPR if it meets the specific requirements of Art. 88 (2) GDPR, which is not the case with Sec. 23 (1) sentence 1 HDSIG. The provision is therefore, in his view, contrary to EU law and superfluous.

The ECJ has now endorsed the view of the Advocate General in its ruling issued on 30 March 2023. According to the ECJ, a “more specific provision” within the meaning of Art. 88 GDPR may not be limited to repeating the provisions of the GDPR but must meet the requirements of Art. 88 (2) GDPR and therefore include “specific measures to safeguard the human dignity, legitimate interests and fundamental rights of the data subject[s]”. Even though it is ultimately the responsibility of the Administrative Court of Wiesbaden to decide whether Sec. 23 (1) sentence 1 HDSIG meets these requirements, the ECJ makes it clear in its ruling that in its view this is not the case: Sec. 23 (1) sentence 1 HDSIG merely repeats the conditions for lawful processing set out in Art. 6 (1) (b) GDPR. It can be assumed that the Administrative Court of Wiesbaden shares this view. This would have the consequence that Sec. 23 (1) sentence 1 HDSIG would in principle be inapplicable and could not constitute an effective legal basis for the processing of personal data. An exception to this – based on the ruling of the ECJ – would only apply if the provision constituted a legal basis within the meaning of Art. 6 (3) GDPR, which, if at all, would only be the case in narrowly defined scenarios.

The wording of the Hessen provision largely corresponds to that of the corresponding German federal provision, Sec. 26 (1) sentence 1 BDSG. German employers broadly base their processing of personal data in the employment context on this provision, as it allows processing for hiring decisions or, after hiring, for carrying out or terminating the employment contract, where this is necessary. Accordingly, the ECJ’s ruling is also likely to have an indirect impact on the German federal provision. Even though the ECJ did not directly address Sec. 26 (1) sentence 1 BDSG (due to its lack of relevance to the questions to be answered), it does mention the provision in its ruling as part of the “legal framework” and thus emphasizes its relevance at the federal level as well. Even though this may seem alarming at first, on closer inspection the consequences of the ruling are quite limited. Nevertheless, employers should now take action (please see below).

Consequences of the decision for the practice

In practice, the question arises as to what concrete significance the ECJ’s decision has for German employers with regard to the processing of employee data.

In light of the ECJ ruling, it is indeed possible that Sec. 26 (1) sentence 1 BDSG cannot be classified as a “more specific provision” within the meaning of Art. 88 GDPR either and is therefore (in general) inapplicable. Just like Sec. 23 (1) sentence 1 HDSIG, the German federal provision does not contain a more specific regulation that goes beyond the GDPR.

Furthermore, it seems likely that Sec. 26 (3) BDSG could be contrary to EU law for the same reasons: this provision regulates the processing of special categories of personal data of employees, such as health data, but its requirements are only marginally more specific than the corresponding regulation in Art. 9 (2) (b) GDPR. The only addition is a further balancing of interests (“the processing … is permissible if … there is no reason to assume that the data subject’s legitimate interest in the exclusion of the processing overrides”).

Even if the courts have not yet decided this point, employers are well advised already now to cite Art. 6 (1) (b) GDPR as a legal basis at least in addition to Sec. 26 (1) sentence 1 BDSG, and Art. 9 (2) (b) GDPR in addition to Sec. 26 (3) BDSG. However, this should not change which processing activities are permitted. For documents such as privacy notices, records of processing activities and/or data protection impact assessments in which Sec. 26 (1) sentence 1 BDSG is cited as a legal basis, an adjustment is recommended as part of a routine audit.

The specific legal basis for the investigation of criminal acts committed by employees (Sec. 26 (1) sentence 2 BDSG) is also likely to remain applicable. This is because it contains requirements that are more specific than the otherwise relevant legal basis of processing on the basis of legitimate interests (Art. 6 (1) (f) GDPR). It is precisely this provision and its specific requirements that are often overlooked by international companies.

The conformity of Sec. 26 BDSG with EU law has been debated for some time (so far, the Federal Labor Court has assumed compatibility with EU law). Due to its open wording, the provision has also led to legal uncertainties in other respects when assessing data protection compliance in various matters. To address these uncertainties, more specific regulations have long been demanded of the German legislature, including by the German data protection authorities. The present ruling of the ECJ now confirms that such regulations are necessary. With reference to this ruling, the Hamburg Data Protection Authority (“Hamburgische Beauftragte für Datenschutz und Informationsfreiheit”) has also emphasized the importance of implementing a new and concise legal framework regulating the processing of personal data in the employment context and has promised to actively push forward new legislative measures. As previous legislative initiatives have all failed, it remains to be seen whether the German legislature will now truly go forward and complete such a legislative process.

A Pro-Innovation Approach: UK Government publishes white paper on the future of governance and regulation of artificial intelligence
https://privacymatters.dlapiper.com/2023/03/a-pro-innovation-approach-uk-government-publishes-white-paper-on-the-future-of-governance-and-regulation-of-artificial-intelligence/
Fri, 31 Mar 2023
Authors: James Clark, Coran Darling, Andrew Dyson, Gareth Stokes, Imran Syed & Rachel de Souza

In November 2021, the UK Government (“Government”) issued the National Artificial Intelligence (AI) Strategy, with the ambition of making the UK a global AI superpower over the next decade. The strategy promised a thriving ecosystem, supported by Government policy that would look at establishing an effective regulatory framework; a new governmental department focussed on AI and other innovative technologies; and collaboration with national regulators.

On 29 March 2023 the Government published the long-awaited white paper (“Paper”) setting out how the UK anticipates it will achieve the first, and most important, of these goals – the creation of a blueprint for future governance and regulation of AI in the UK. The Paper is open for consultation until 21 June 2023.

The Paper, headed “A pro-innovation approach”, recognises the importance of building a framework that engenders trust and confidence in responsible use of AI (noting the key risks to health, security, privacy, and more, that can arise through an unregulated approach), but cautions against ‘overbearing’ regulation which may adversely impact innovation and investment.

This theme runs throughout the Paper and expands into recommendations that support a relatively light-touch and arguably more organic regulatory approach than we have seen in other jurisdictions. This is most notable when compared to the approach of the EU, where the focus has been on developing a harmonizing AI-specific law and a supporting AI-specific regulatory regime.

The Paper contends that effective AI regulation can be constructed without the need for new cross-sectoral legislation. Instead, the UK is aiming to establish “a deliberately agile and iterative approach” that avoids the risk of “rigid and onerous legislative requirements on businesses”. This ambition should be largely achieved by co-opting regulators in regulated sectors to effectively take direct responsibility for the establishment, promotion, and oversight of responsible AI in their respective regulated domains. This would then be supported by the development of non-binding assurance schemes and technical standards.

Core Principles

This approach may be different in execution from the proposals we are seeing come out of Europe with the AI Act. If we look beneath the surface, however, we find the Paper committing the UK to core principles for responsible AI which are consistent across both regimes:

  • Safety, security, and robustness: AI should function in a secure, safe, and robust manner, where risks can be suitably monitored and mitigated;
  • Appropriate transparency and explainability: organisations developing and deploying AI should be able to communicate the method in which it is used and be able to adequately explain an AI system’s decision-making process;
  • Fairness: AI should be used in ways that comply with existing regulation and must not discriminate against individuals or create unfair commercial outcomes;
  • Accountability and governance: appropriate measures should be taken to ensure there is appropriate oversight of AI systems and adequate measures to ensure accountability; and
  • Contestability and redress: there must be clear routes to dispute harmful outcomes or decisions generated by AI.

The Government intends to use the principles as a universal guardrail to guide the development and use of AI by companies in the UK. This approach aligns with international thinking that can be traced back to the OECD AI Principles (2019), the Council of Europe’s 2021 paper on a legal framework for artificial intelligence, and the recent Blueprint for an AI Bill of Rights proposed by the White House’s Office of Science and Technology Policy.

Regulator Led Approach

The UK does not intend to codify these core principles into law, at least for the time being. Rather, the UK intends to lean on the supervisory and enforcement powers of existing regulatory bodies, charging them with ensuring that the core principles are followed by organisations for whom they have regulatory responsibility.

Regulatory bodies, rather than lawmakers or any ‘super-regulator’, will therefore be left to determine how best to promote compliance in practice. This means, for example, that the FCA will be left to regulate AI across financial services; the MHRA to consider what is appropriate in the field of medicines and medical devices; and the SRA for legal service professionals. This approach is already beginning to play out in some areas. For example, in October 2022, the Bank of England and FCA jointly released a Discussion Paper on Artificial Intelligence and Machine Learning (DP5/22), which is intended to progress the debate on how regulation and policy should play a role in use of AI in financial services.

To enable this to work, the Paper contemplates a new statutory duty on regulators which requires them to have due regard to the principles in the performance of their tasks. Many of these duties already exist in other areas, such as the so-called ‘growth duty’ that came into effect in 2017 which requires regulators to have regard to the desirability of promoting economic growth. Regulators will be required by law to ensure that their guidance, supervision, and enforcement of existing sectoral laws takes account of the core principles for responsible AI. Precisely what that means in practice remains to be seen.

Coordination Layer

The Paper recognises that there are risks with a de-centralised framework. For example, regulators may establish conflicting requirements, or fail to address risks that fall between gaps.

To address this, the Paper announces the Government’s intention to create a ‘coordination layer’ that will cut across sectors of the economy and allow for central coordination on key issues of AI regulation. The coordination layer will consist of several support functions, provided from within Government, including:

  • assessment of the effectiveness of the de-centralised regulatory framework – including a commitment to remain responsive and adapt the framework if necessary;
  • central monitoring of AI risks arising in the UK;
  • public education and awareness-raising around AI; and
  • testbeds and sandbox initiatives for the development of new AI-based technologies.

The Paper also recognises the likely importance of technical standards as a way of providing consistent, cross-sectoral assurance that AI has been developed responsibly and safely. To this end, the Government will continue to invest in the AI Standards Hub, formed in 2022, whose role is to lead the UK’s contribution to the development of international standards for the development of AI systems.

This standards-based approach may prove particularly useful for those deploying AI in multiple jurisdictions and has already been recognised within the EU AIA, which anticipates compliance being established by reference to common technical standards published by recognised standards bodies. It seems likely that over time this route (use of commonly recognised technical standards) will become the de facto default route to securing practical compliance with the emerging regulatory regimes. This would certainly help address the concerns many will have about the challenge of meeting competing regulatory regimes across national boundaries.

International comparisons

EU Artificial Intelligence Act

The proposed UK framework will inevitably attract comparisons with the different approach taken by the EU AIA. Where the UK intends to take a sector-by-sector approach to regulating AI, the EU has opted for a horizontal, cross-sector, regulation-led approach. Further, the EU clearly intends exactly the same single set of rules to apply EU-wide: the EU AIA is framed as a directly effective Regulation that applies directly as law across the bloc, rather than as an EU Directive, which would require Member States to develop domestic legislation to comply with the adopted framework.

The EU and UK approaches each have potential benefits. The EU’s single horizontal approach of regulation across the bloc ensures that organisations engaging in regulated AI activities will, for the most part, only be required to understand and comply with the AI Act’s single framework and apply a common standard based on the use to which AI is being put, regardless of sector.

The UK’s approach provides a less certain legislative framework, as companies may find that they are regulated differently in different sectors. While this should be mitigated through the ‘coordination layer’, it will likely lead to questions about exactly which rules apply when, and to the risk of conflicting regulatory guidance. This additional complexity will no doubt be a potential detractor for the UK, but if executed effectively, the benefits of a regime that is agile to evolving needs and technologies could trump the EU’s more codified approach. In theory, it should be much easier for the UK to implement changes via regulatory standards, guidance, or findings than it would be for the EU to push amendments through a relatively static legislative process.

US Approach

There are clear parallels between the UK and the likely direction of travel in the US, where a sector-by-sector approach to the regulation of AI is the preferred choice. In October 2022, the White House Office of Science and Technology Policy published a Blueprint for an AI Bill of Rights (“Blueprint”). Much like the Paper, the Blueprint sets out an initial framework for how US authorities, technology companies, and the public can work to ensure AI is implemented in a safe and accountable manner. The US anticipates setting out principles that will help guide organisations to manage and (self-)regulate the use of AI, but without the level of directional control that the UK anticipates passing down to sector-specific regulators. Essentially, the US position will be to avoid direct intervention, leaving regulation at state or federal level for others to decide. It remains to be seen how the concepts framed in the Blueprint might eventually translate into powers for US regulators.

A Push for Global Interoperability

While the Government seeks to capitalise on the UK’s strategic position as third in the world for the number of domestic AI companies, it also recognises the importance of collaboration with international partners. Focus moving forward will therefore be directed to supporting global opportunities while protecting the public against cross-border risks. The Government intends to promote interoperability between the UK approach and the differing standards and approaches across jurisdictions. This should ensure that the UK’s regulatory framework encourages the development of a compatible system of global AI governance that allows organisations to pursue ventures across jurisdictions, rather than being isolated by jurisdiction-specific regulations. The approach is expected to leverage existing, proven and agreed-upon assurance techniques and international standards, which play a key role in the wider regulatory ecosystem. It is therefore expected to support cross-border trade by setting out internationally accepted ‘best practices’ that can be recognised by external trading partners and regulators.

Next steps

The Government acknowledges that AI continues to develop at pace, and new risks and opportunities continue to emerge. To continue to strengthen the UK’s position as a leader in AI, the Government is already working in collaboration with regulators to implement the Paper’s principles and framework. It anticipates that it will continue to scale up these activities at speed in the coming months.

In addition to allowing for responses to its consultation (until 21 June 2023), the Government has staggered its next steps into three phases: i) within the first 6 months from publication of the Paper; ii) 6 to 12 months from publication; and iii) beyond 12 months from publication.

Find out more

You can find out more on AI and the law and stay up to date on the UK’s push towards regulating AI at Technology’s Legal Edge, DLA Piper’s tech-sector blog.

For more information on AI and the emerging legal and regulatory standards, visit DLA Piper’s focus page on AI.

You can find a more detailed guide on the AI Act and what’s in store for AI in Europe in DLA Piper’s AI Regulation Handbook.

To assess your organisation’s maturity on its AI journey (and check where you stand against sector peers) you can use DLA Piper’s AI Scorebox tool.

DLA Piper continues to monitor updates and developments of AI and its impacts on industry across the world. For further information or if you have any questions, please contact the authors or your usual DLA Piper contact.

EU: Final version of the EDPB Guidelines 05/2021 on the Interplay between the application of Art. 3 and the provisions on international transfers as per Chapter V of the GDPR
https://privacymatters.dlapiper.com/2023/03/eu-final-version-of-the-edpb-guidelines-05-2021-on-the-interplay-between-the-application-of-art-3-and-the-provisions-on-international-transfers-as-per-chapter-v-of-the-gdpr/
7 March 2023
Authors: Andreas Rüdiger, Philipp Adelberg

On 14 February 2023, the European Data Protection Board (“EDPB”) published the updated and final version of its Guidelines 05/2021 on the Interplay between the application of Article 3 and the provisions on international transfers as per Chapter V of the GDPR (EDPB Guidelines 05/2021). In comparison to the first version of the guidelines published in 2021, the core messages of the paper remain the same. The EDPB sets out three essential criteria for qualifying a processing of personal data as a transfer to a third country. In the update to its guidelines, the EDPB now specifies these requirements in more concrete terms.

Transfer to a third country

Since the GDPR itself does not provide for a definition of the term “transfer of personal data to a third country or to an international organisation” and case law only exists to a limited extent in this regard, the EDPB elaborates three cumulative criteria to qualify a processing operation as a transfer:

  1. the controller/processor (“exporter”) is subject to the GDPR for the given processing,
  2. the exporter discloses by transmission or otherwise makes personal data, subject to this processing, available to another controller, joint controller or processor (“importer”), and
  3. the importer is located in a third country, irrespective of whether or not this importer is subject to the GDPR for the given processing in accordance with Art 3 GDPR or is an international organisation.

If one of these criteria is not met, the respective processing activity cannot be considered a transfer within the meaning of the GDPR.

Even if in such cases the provisions of Chapter V of the GDPR do not apply, the EDPB expressly points out that the controller must nevertheless comply with the other provisions of the GDPR and remains fully accountable for its processing activities, regardless of where they take place, as these may be associated with certain risks if they take place outside of the EU (e.g. where an employee of an EU controller travels abroad and has access to the data of that controller while in a third country). Such risks may arise from conflicting national law or disproportionate access rights for the authorities of the third country. A controller must take these risks into account when initiating a transmission of personal data and take appropriate data security measures.

In this regard, the Committee of Independent German Federal and State Data Protection Supervisory Authorities (Datenschutzkonferenz – “DSK”) – a board consisting of the federal and state data protection supervisory authorities that deals with and comments on current data protection issues in Germany – stated in its resolution of 31 January 2023 on the assessment of access possibilities to personal data for public authorities of third countries under current data protection law (DSK resolution of 31 January 2023) that the mere risk that public authorities of third countries might request a transmission of personal data to the third country is not, per se, sufficient to assume a transfer of data within the meaning of Art. 44 et seqq. GDPR.

Both the EDPB and the DSK provide examples of security measures to be taken in such cases. These include, among other things, the implementation of appropriate technical and organizational measures as well as a detailed examination of the law of the third country, any assurances given by the contractual partner and the possibility of complying with them, and the assessment of other risks associated with the transmission.

Specification of the criteria

The EDPB now specifies the above criteria in its second version of the Guidelines 05/2021. For this purpose, the EDPB also elaborates extensive examples of application, which illustrate the interplay between Art. 3 GDPR and Chapter V of the GDPR.

Of particular relevance is the clarification that it is sufficient for making personal data available (criterion 2) if, for example, personal data is accessed remotely from a third country or is stored in a cloud outside the European Economic Area (EEA), provided the other criteria are met as well. However, if the data processing is of a solely internal nature to the controller, i.e. when the data is not transferred to another controller or a processor and therefore does not leave the organizational structure of the controller, personal data is not “made available” to another controller/processor. This is illustrated in example 8.1 of the Guidelines 05/2021.

The twelfth and final example elaborated by the EDPB is also of high importance to data protection practice. In this scenario, a controller based in the EU engages a processor which is also based in the EU but is a subsidiary of a company based in a third country. Understandably, the EDPB does not consider the transfer of personal data by the controller to the processor to be a third-country transfer. However, this constellation becomes problematic where the processor, in its function as a subsidiary, is also subject to the laws of the third country in which the parent company is located, with extraterritorial effect. This may result in authorities of the third country requesting that the personal data processed by the processor on behalf of the controller be transmitted to the respective authority in accordance with applicable local law of the third country. If the processor complies and transmits the data to authorities in the third country, the EDPB considers this to be a third-country transfer. If the controller has prohibited such a transfer in the data processing agreement, the processor acts contrary to the instructions of the controller and is itself considered to be the controller for this processing operation pursuant to Art. 28 (10) GDPR. The controller is obliged to check in advance whether the commissioned processors are subject to such access rights of third-country authorities and, if necessary, to take appropriate technical and organizational measures to ensure that the processing is also carried out in accordance with the provisions of Chapter V of the GDPR.

Conclusion

The legally non-binding Guidelines 05/2021 of the EDPB are to be welcomed insofar as they show, in a comprehensible and easy-to-use manner, in which constellations a third-country transfer within the meaning of the GDPR is to be assumed, in particular taking into account the rules on the territorial scope of application under Art. 3 GDPR. In addition, they illustrate that there may nevertheless be risks of violations of the GDPR by data controllers in cases in which a data flow to a third country does not qualify as a third-country transfer. In our opinion, this constitutes a rather abstract risk and should not lead to risk assessment obligations for a controller equal to those for actual third-country transfers. However, given the complexity and multi-layered nature of the possible constellations of processing operations, companies are well advised to examine carefully the extent to which personal data is transferred to a third country when involving additional controllers or processors, in order to consider and implement appropriate security measures and avoid potential fines. Finally, the EDPB’s clarification is to be welcomed that the transfer of personal data by a processor based in the EU to an authority in a third country may be contrary to instructions and will, if so, qualify the processor itself as a controller under Art. 28 (10) GDPR.

More on how to deal with third country transfers and detailed information on DLA Piper’s legal tech tool “Transfer” can be found here.

Australia: Cyber security round-up – new Cyber Security Strategy, data breach stats and more
https://privacymatters.dlapiper.com/2023/03/australia-cyber-security-round-up-new-cyber-security-strategy-data-breach-stats-and-more/
3 March 2023
Author: Sarah Birkett

Cyber Security Strategy discussion paper launched

This week saw the launch of a discussion paper for the Australian Government’s 2023-2030 Australian Cyber Security Strategy. The discussion paper refers to the lofty aim of making Australia the most cyber secure nation by 2030.

The discussion paper, which acknowledges that the Australian Government was “ill-equipped” to respond to the large scale data breaches which occurred in 2022 (namely Medibank and Optus), emphasises the importance of protecting customer data and ensuring that Australians can continue to access critical services in the event of a cyber-attack.

One of the core policy areas that will be addressed in the Strategy is the “enhancement and harmonisation of regulatory frameworks”.  Several options are being considered to give effect to this, including:

  • Development of best practice cyber security standards;
  • New laws, such as a Cyber Security Act, to provide a more explicit specification of cyber security obligations;
  • Expansion of the existing Security of Critical Infrastructure Act to include customer data and systems within the definition of critical assets. This proposal is particularly controversial given the power for the Australian Signals Directorate to “step-in” and control critical assets as a measure of last resort under that Act; and
  • A single reporting portal for all cyber incidents, to harmonise the existing requirements to report separately to multiple regulators.

Additional policy areas identified for further consideration in the discussion paper include:

  • Developing national frameworks to respond to major incidents, including the development of fit-for-purpose approaches to incident management and coordination and ensuring that post-incident reviews of major incidents are conducted and root cause findings shared.
  • Designing and sustaining security in new technologies, such as quantum computing, IoT and AI, each of which have the potential to significantly impact, and be impacted by, cyber security issues.
  • Supporting Australia’s cyber security workforce and skills pipeline.

The Strategy is expected to be finalised by the end of 2023.  An Expert Advisory Board has been established to assist with development of the Strategy, and is inviting submissions on the areas outlined in the discussion paper until 15 April 2023.

Establishment of Cyber Security Coordinator to assist with coordinated responses to cyber attacks

Since the release of the discussion paper, the Federal Government has announced its intent to establish a national Coordinator for Cyber Security.

The Coordinator will form part of a broader National Office for Cyber Security and will be responsible for ensuring a “centrally coordinated approach” to cyber security, including coordination of major incidents.

Latest data breach statistics show that data breaches are on the rise

The launch of the cyber security discussion paper coincides with the publication of the Office of the Australian Information Commissioner’s latest statistics on the notifiable data breach regime.

These statistics confirm the commonly held view that data breaches are on the rise in Australia.

The 6 month period from July – December 2022 saw a 26% increase in the number of data breaches reported against the previous 6 month period.  For breaches caused by criminal or malicious attacks, the increase was 46% for the same period.  Health care and financial services remain the two highest reporting sectors.

Significantly, there were five breaches which impacted more than 1 million Australians – with one impacting more than 10 million. Whilst the high-profile incidents affecting Optus and Medibank account for two of these incidents, these statistics highlight that several major data breaches have gone unreported in Australia.

EU – US adequacy decision: Update
https://privacymatters.dlapiper.com/2023/03/eu-us-adequacy-decision-update/
2 March 2023
Authors: Andreas Rüdiger, Philipp Adelberg

The debate on transatlantic data transfers, a possible adequacy decision for the US and the EU-US Data Privacy Framework (“DPF“) is gaining new momentum. On 14 February 2023, the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs published its draft motion for a resolution regarding the adequacy of the protection of personal data under the DPF (to be found under RD_Statements (europa.eu)). Two weeks later, on 28 February 2023, the European Data Protection Board (“EDPB“) published its Opinion 5/2023 on the Commission’s draft adequacy decision based on the DPF (to be found under EDPB welcomes improvements under the EU-U.S. Data Privacy Framework, but concerns remain | European Data Protection Board (europa.eu)). The DPF is intended to replace the US Privacy Shield, which was declared invalid by the ECJ’s “Schrems II” decision (C-311/18), and to facilitate data transfers from Europe to the United States. However, the opinions of the Parliament and the EDPB differ on the question of the adequate level of data protection of the DPF.

 Draft motion for a resolution 2023/2501 (RSP)

In its draft motion for a resolution, the Parliament calls on the Commission to continue negotiations with the US regarding the DPF in order to establish an equivalent level of protection for personal data in the EU and the US. In its recitals for this request, the Parliament concludes that the DPF fails to establish an equivalent level of data protection.

Referring to the history of previous efforts regarding the approval of personal data transfers to the US, the legislation as well as the ECJ’s case law, the Parliament stresses the fundamental importance of personal data transfers, especially from an economic and innovation perspective. At the same time, the Parliament warns against a far-reaching restriction of data subjects’ fundamental rights.

The European Parliament criticizes the current legal framework in the US regarding the protection of personal data. This criticism stems not only from the lack of uniform data protection laws at the US federal level, but also from a fundamentally different understanding of data protection principles compared with the position of the European legislator. While the Parliament welcomes the efforts of the US to adapt the data protection regime in its own territory in the form of Executive Order 14086 (“EO“), it also criticizes the EO for being unclear, imprecise, and unpredictable, as it can be amended by the US President at any time. Moreover, the Parliament is also skeptical about the new options for legal protection available to data subjects.

However, regardless of the substantial criticisms made by the Parliament at this point, the potential legal effects (or lack thereof) of such a motion for a resolution must be taken into account. As it is merely the preparatory step towards a resolution, which itself is not legally binding (cf. Art. 288 para. 5 TFEU), its effects remain limited to a potential influence on EU legislation.

EDPB-Opinion 5/2023

At the Commission’s request, the EDPB has assessed the adequacy of the level of data protection in the US based on the DPF. While this Opinion is not legally binding, it is nevertheless a procedural requirement for the Commission’s adoption of a decision due to the EDPB’s involvement in the decision-making process at the Commission’s request.

Overall, the EDPB is of the opinion that the EO, as part of the DPF, leads to significant improvements in the level of protection for personal data compared to the US Privacy Shield. This opinion relates in particular to the introduction of the principles of necessity and proportionality as well as the listing of specific purposes for which data processing may take place. Additionally, the EDPB commends the new individual remedy for EU data subjects in case of data processing in breach of the rules of the DPF.

Still, the EDPB continues to raise concerns which should be addressed in order to provide further clarity and a solid foundation for a possible adequacy decision. These concerns relate in particular to data subjects’ rights (e.g., some exceptions to the right of access and the time limits and modalities for the right to object), the lack of key definitions, the lack of clarity regarding the application of the DPF to processors, and the broad exception for publicly available information.

Outlook

In summary, in our opinion the chances of the Commission adopting an adequacy decision based on the DPF are good. The motion of 14 February 2023 preparing a resolution of the Parliament does not prevent this, due to its lack of legally binding effect. However, the EDPB’s opinion, which is also not legally binding, is of greater importance for the debate insofar as it was requested by the Commission and can be expected to have a significant influence on the further steps towards an adequacy decision. The EDPB sees substantial improvements compared to predecessor regulations and does not expect the DPF to be an exact replication of European data protection law. The remaining concerns can be addressed through the further creation of transparency. The EDPB’s opinion has already been endorsed by some of the German data protection supervisory authorities, who declare the now-expected adequacy decision to be, in principle, a success for data protection (for example, the Hamburg Commissioner for Data Protection and Freedom of Information: Assessment of the adequacy decision for the US. Thomas Fuchs: “Die Wahrheit ist auf dem Platz” (datenschutz-hamburg.de)). The ball is now in the Commission’s court to consider the EDPB’s concerns and, if necessary, address them directly with the US. Whether an adequacy decision can still be adopted in a timely manner, given these open points, remains to be seen. Nevertheless, it is clear that the Commission is proactively addressing this issue.

Australia Privacy Act review – a blueprint for change?
https://privacymatters.dlapiper.com/2023/02/australia-privacy-act-review-a-blueprint-for-change/
20 February 2023
Authors: Sarah Birkett, Nicholas Boyle

The Australian Attorney-General has published the (long-awaited) results of the Privacy Act review.

The report recommends a number of changes to the Australian privacy framework, including various changes to Australia’s core privacy legislation, the Privacy Act 1988 (Cth).

The report does not represent official Government policy and there is no guarantee that the proposed changes will eventually make their way into law.  However, Australian businesses should start preparing for these changes, particularly given the level of bipartisan support for privacy reform following several large-scale data breaches in 2022.

What changes are proposed?

Broadly, the structure of the Privacy Act will remain unchanged, despite the number of recommended changes identified.  Notably, the Australian Privacy Principles will not be supplemented with more precise rules governing data processing activities.

Some of the proposals can be viewed as clarifications rather than substantive changes, including calls for expanded guidance notes from Australia’s privacy regulator, the Office of the Australian Information Commissioner.

However, there are a number of recommendations which, if implemented, will materially change the way in which Australian organisations approach privacy compliance.  For example:

  • A significant expansion of data subject rights, with many concepts borrowed from other regimes such as the GDPR, including the right of erasure, right to withdraw consent, right to object to the collection, use or disclosure of personal information and the right to de-index online search results containing certain categories of personal information.
  • Introduction of a direct right of action for individuals, for a serious interference with privacy, plus a statutory tort of privacy.
  • More structured processes around direct marketing, tracking and trading in personal information, including an unqualified right to opt-out of receiving targeted advertising.
  • A partial removal of the exemption for employee records, with limited obligations applying to HR data such as the requirement to keep data secure and notify staff of relevant data breaches.
  • Greater transparency around privacy policies and collection notices, with additional data points to be included and calls for development of standardised templates and layouts on a sector-by-sector basis, to make it easier for data subjects to understand and compare policies.
  • Updating the basis on which offshore transfers can be made, including where Standard Contractual Clauses are used, where informed consent has been obtained or where an adequacy decision is in place.
  • Removal of the exemption for small businesses (i.e. with an annual turnover of AUD 3 million or less), which will materially increase the number of organisations required to comply with the Privacy Act, although this has been flagged as requiring further consultation.
  • For organisations which process the personal information of minors, a suite of changes including development of a Children’s Online Privacy Code and a prohibition on direct marketing to children unless certain conditions are met.

What are the next steps?

It’s yet to be seen how the Australian Government will respond to the review, and whether it will accept the recommendations made.

The report itself notes that some proposals have not had the benefit of stakeholder feedback and will require further consultation prior to implementation.  Therefore it’s likely to be some time before the changes can be adopted in full (if indeed they are adopted at all).

In the interim, there are changes which Australian businesses can make to their processes now, to reduce the impact if and when these recommendations are adopted.

For further information, please contact Sarah Birkett or Nick Boyle.

US: Google to pay $29.5 million to Indiana and District of Columbia to settle location privacy suits
https://privacymatters.dlapiper.com/2023/01/us-google-to-pay-29-5-million-to-indiana-and-district-of-columbia-to-settle-location-privacy-suits/
13 January 2023

The following day, then-DC Attorney General Karl Racine announced a similar settlement agreement. In the two settlements, Google agreed to pay Indiana and the District of Columbia $29.5 million, collectively ($20 million and $9.5 million, respectively). These settlements follow similar settlements last year with 40 US state attorneys general and with Australian regulators.

The settlements highlight government expectations that companies obtain proper consents, including robust disclosures of data practices, for sensitive personal information such as location information.

Regulatory and litigation history

Google provides several apps and platforms that collect user location information, particularly from mobile devices, such as through Google Search and Google Maps. Google has used this information to support its business operations in several ways, including by disclosing user location information to other businesses, e.g., to learn how digital advertising can encourage people to visit brick-and-mortar stores. Following news reports in 2018, state attorneys general, including Attorneys General Rokita and Racine, alleged that Google collected location information from users without their consent, including by misleading users to falsely believe that certain settings limited location data collection.

These allegations included:

  • Deceiving consumers regarding their ability to protect their privacy through Google Account Settings
  • Misrepresenting and omitting material facts regarding the Location History and Web & App Activity Settings
  • Misrepresenting and/or omitting material facts regarding consumers’ ability to control their privacy through Google Account Settings
  • Misrepresenting and omitting material facts regarding the Google Ad Personalization Setting
  • Deceiving consumers regarding their ability to protect their privacy through device settings and
  • Deploying deceptive practices that undermine consumers’ ability to make informed choices about their data, including dark patterns.

Key takeaways

Pursuant to the settlements, in addition to the payments, the company must make prominent disclosures about its data practices prior to obtaining consent to collect location information, provide users with additional account controls, and introduce limits to its data use and retention practices. Certain aspects of the settlements deserve particular attention:

  • The settlement requires Google to issue notices to users who allow certain location tracking settings through Google services or devices, including via pop-up notifications and email, that disclose whether their location information is being collected and provide instructions on how to limit collection and delete collected location information. Google is also required to notify users via email of any material changes in its privacy policy about the collection, use, and retention of user location information.
  • Google must establish and maintain a “location technologies” webpage that discloses Google’s location data policies and practices as well as how users can limit collection of, and delete collected, location information. Google must also provide a hyperlink to this webpage, in its privacy policy, in the account creation flow, and whenever users enable or are prompted to enable a location-related account setting while using a Google product.
  • The settlement requires Google to implement more specific language in a few places:
    • Settings webpage, about location information: “Location info is saved and used based on your settings. Learn more.”
    • Location technologies webpage, about ads: That users cannot prevent the use of location information in personalized ads across services and devices, based on user activity on Google services, including Google Search, YouTube, and websites and apps that partner with Google to show ads.
  • Google may only share a user’s precise location information with a third-party advertiser with that user’s express affirmative consent for use and sharing by that third party.
  • Google must conduct internal privacy impact assessments before implementing any material changes of how certain settings pages impact precise location information or how Google shares users’ precise location information related to such settings.

While there are many notable aspects to these settlements, it is also notable that this occurred as many states are beginning to implement new privacy laws and regulations, which include increased business obligations for the collection, use, and disclosure of sensitive personal information, such as location information.

See the Indiana AG and District of Columbia AG press releases here (IN) and here (DC).  Find out more about the implications of these developments by contacting either of the authors.

UK: Data adequacy post-Brexit – the UK’s first ‘data bridge’
https://privacymatters.dlapiper.com/2023/01/uk-data-adequacy-post-brexit-the-uks-first-data-bridge/
4 January 2023
Author: James Clark

On 19 December 2022 the UK government’s first data adequacy decision of the post-Brexit era came into effect. Under the Data Protection (Adequacy) (Republic of Korea) Regulations 2022, the UK formally determined that the Republic of Korea provides an adequate level of data protection for the purposes of the UK GDPR. Consequently, UK businesses can now freely transfer personal data to recipients in South Korea without needing to take any additional steps (such as entering into standard contractual clauses or carrying out transfer impact assessments).

The UK’s decision was expected, as the European Commission had already granted the Republic of Korea an adequacy decision under EU GDPR back in December 2021. However, the UK’s decision – which it is referring to as a ‘data bridge’ – is broader than the EU decision, as it extends to personal data that benefits from exemptions from South Korea’s primary data protection law, the Korean Personal Information Protection Act (PIPA).

How did we get here?

Under the GDPR, transfers of personal data to ‘third countries’ are prohibited, unless one of the conditions set out in Chapter V of the GDPR is met. The most favourable condition is that an ‘adequacy decision’ exists for the third country (under Article 45 GDPR), which means that the third country is deemed to provide an equivalent level of data protection (taking into account factors such as the rule of law and fundamental privacy safeguards, in addition to personal data protection laws). Where an adequacy decision exists, personal data can move freely to the third country without any additional steps being required.

Prior to Brexit, the UK, in common with all other Member States, relied on the European Commission to determine adequacy decisions for third countries. Post-Brexit, when the UK created its own parallel version of the GDPR, the power to determine adequacy decisions was transferred to the Secretary of State (at the same time as the existing EU adequacy decisions were grandfathered into UK law on a temporary basis). The Korea data bridge is the first adequacy decision made by the Secretary of State under the UK GDPR.

What does the decision cover?

The data bridge covers any transfer of personal data to a person in the Republic of Korea who is subject to the PIPA. The PIPA is a general and comprehensive data protection statute which is broadly analogous to the GDPR.

Unlike the EU decision, the UK data bridge also encompasses transfers of personal credit information to persons in the Republic of Korea who are subject to the Use and Protection of Credit Information Act, which provides specific rules applicable to organisations in the financial sector when they process personal credit information.

What can we expect from future data bridges?

The UK government has indicated that it has ambitious plans for data bridges. It believes that “global networks of personal data flows are critical to the UK’s prosperity and modern way of life”, and it wants to use data bridges as a mechanism to “remov[e] unnecessary barriers to cross-border data flows”. Under its ‘Data: A New Direction’ strategy, the UK has selected the following countries as its ‘top priorities’ for an adequacy decision:

  • Australia;
  • Colombia;
  • Dubai International Financial Centre;
  • Singapore; and
  • the United States of America.

In addition, the following countries represent the UK’s longer-term priorities:

  • India;
  • Brazil;
  • Indonesia; and
  • Kenya.

Given that the EU is on the cusp of securing a partial adequacy decision for the United States through its ‘EU-US Data Privacy Framework’, the UK’s next steps for that country – which is so crucial when it comes to the IT infrastructure of UK businesses – will be closely watched. In particular, it will be interesting to see whether the UK puts in place a data bridge with the same scope as the EU deal, or whether the UK tries to do something more ambitious – a data bridge with a broader scope, as has been concluded with Korea – on either an immediate or longer-term basis.

Finally, now that the UK is no longer subject to the jurisdiction of the Court of Justice of the European Union – something that wasn’t the case when the Schrems II judgment was handed down in 2020 – it is important to note that any challenges to a UK-US data bridge (or any other UK data bridge, for that matter) by privacy activists will be conducted separately from challenges to the EU-US decision, and will proceed through UK, rather than European, courts.
