Privacy Matters: DLA Piper's Global Privacy and Data Protection Resource

Australia: Privacy Act amendments and Cyber Security Act become law https://privacymatters.dlapiper.com/2024/12/australia-privacy-act-amendments-and-cyber-security-act-become-law/ Thu, 05 Dec 2024 09:37:47 +0000

On 29 November 2024, the Australian Senate passed the Privacy and Other Legislation Amendment Bill 2024 (Cth) (the Privacy Act Bill). This follows the passage of the Cyber Security Act 2024 (Cth), and other cyber-security related amendments, on 25 November 2024.

The majority of the amendments to the Privacy Act 1988 (Cth) will commence the day after the Privacy Act Bill receives Royal Assent, with a few exceptions.

The Privacy Act Bill contains key amendments to the Privacy Act including:

  • A statutory tort for serious invasions of privacy – this will only apply (amongst other criteria) where the conduct in question was intentional or reckless, and this section of the Bill will take effect no later than six months after the Act receives Royal Assent.
  • The framework for a Children’s Online Privacy Code – this will be developed by the Information Commissioner and will apply to social media platforms and any online services likely to be accessed by children.
  • Tiered sanctions for less serious privacy breaches – this includes civil penalties of up to AUD 3.3 million for an “interference with privacy” and lower level fines of up to AUD 330,000 for administrative breaches, such as deficient privacy policies.  The headline penalties of up to the greater of AUD 50 million, three times the benefit of a contravention, or 30% of annual turnover, remain for conduct which amounts to a “serious interference with privacy”.
  • Requirements to include details of the use of automated decision making in privacy policies, where personal information is used in wholly or substantially automated decision making that could reasonably be expected to significantly affect the rights or interests of an individual. However, this requirement will not take effect for 24 months.
  • The introduction of a criminal offence for doxing.
  • Eligible data breach declarations and information sharing – these are designed to allow limited information sharing following a data breach, in circumstances which would otherwise be in breach of the Privacy Act (such as disclosing information to banks and other institutions for the purpose of enhanced monitoring).
  • Clarifications to APP 11 to ensure it is clear that the reasonable steps which entities must take to protect personal information include “technical and organisational measures”.
  • The introduction of equivalency decisions under APP 8 to facilitate cross-border transfers of data.

Our previous post, available here, provides further insights regarding these changes.

Whilst the Privacy Act Bill implements some of the recommendations from the Privacy Act Review Report, subsequent tranches of amendments are expected in the next 12-18 months to implement the remaining recommendations.

The Cyber Security Act 2024 (Cth), which received Royal Assent on 29 November 2024, introduces:

  • A mandatory ransomware reporting requirement – reports must be made to the Department of Home Affairs if a ransomware payment is paid to an extorting entity. This requirement will be implemented after a 6 month implementation period, and is drafted so as to also capture ransomware payments made on behalf of an entity doing business in Australia.
  • A Cyber Review Board which will conduct no-fault, post incident reviews of significant cyber security incidents in Australia.
  • A limited use exception –  this prevents information which is voluntarily provided to certain Government departments from being used for enforcement purposes, and is designed to encourage enhanced cooperation between industry and Government during cyber incidents.
  • Mandatory security standards for smart devices.

Our previous post, available here, includes further details on the cyber security legislative package.

Australia: In-Store Facial Recognition Tech Breached Privacy Act https://privacymatters.dlapiper.com/2024/11/australia-in-store-facial-recognition-tech-breached-privacy-act/ Fri, 22 Nov 2024 09:14:22 +0000

“Ethically challenging” and “the most intrusive option” – these are some of the words Australia’s Privacy Commissioner used to describe facial recognition technology (FRT), and its use by national hardware retailer Bunnings.

The Office of the Australian Information Commissioner (OAIC) has released the findings of its much-awaited investigation into the use of FRT in at least 62 Bunnings stores in Victoria and New South Wales between November 2018 and November 2021. FRT was used to, as Bunnings submitted, monitor and identify individuals known by the retailer to engage in antisocial behaviour in its stores.

The investigation was sparked by consumer advocate group Choice, which flagged concerns about the use of FRT by Bunnings and other retailers in 2022. Facial recognition technology collects biometric information about an individual. Biometric information is sensitive information, which is entitled to specific protections under Australia’s overarching privacy law, the Privacy Act 1988 (Cth) (Privacy Act). Choice took the view that sensitive personal information was being collected via in-store FRT without sufficient notice to customers, and that the collection was “disproportionate” to legitimate business functions.

The OAIC’s investigation has affirmed these concerns.

Key Findings

Bunnings breached the Australian Privacy Principles (APPs) in the Privacy Act by unlawfully interfering with the privacy of individuals whose personal and sensitive information it collected through the FRT system.

  • Lack of Consent: Sensitive information was collected without consent, breaching APP 3.3, which prohibits such collection unless specific consent is given (or an exception applies, which it did not in this case).
  • Failure to Notify: Bunnings did not adequately inform individuals about the collection of their personal information. This was a breach of APP 5.1, which requires entities to notify individuals about certain matters regarding their personal information as it is collected.
  • Inadequate Practices and Policies: Bunnings failed to implement proper practices, policies, and procedures to ensure compliance with the APPs, breaching APP 1.2.
  • Incomplete Privacy Policies: Bunnings’ privacy policies did not include information about the kinds of personal information it collected and held, and how, breaching APP 1.3.

The OAIC has emphasised that entities using FRT must be transparent, and ensure individuals can provide informed consent.

Along with the outcome of the investigation, the regulator has also issued specific guidance on the use of FRT, stating, “the use of facial recognition technology interferes with the privacy of anyone who comes into contact with it,” and that convenience is not a sufficient justification for its use. Businesses must consider five key principles when looking to employ FRT: 1) privacy by design; 2) necessity and proportionality; 3) consent and transparency; 4) accuracy and bias; and 5) governance and ongoing assurance.

What’s Next for Bunnings?

Bunnings had already paused its use of FRT. As a result of its investigation, the OAIC has made declarations that Bunnings:

  • Not repeat or continue the acts and practices that led to the interference with individuals’ privacy.
  • Publish a statement about the conduct.
  • Destroy all personal information and sensitive information collected via the FRT system that it still holds (after one year).

This decision aligns with the continued emphasis on privacy rights in Australia. As we await further legislative updates to the Privacy Act in the new year, businesses operating in Australia will need to apply greater scrutiny to the security and privacy practices adopted in respect of consumers.

Australia: Privacy Act Updates Expected in August 2024 https://privacymatters.dlapiper.com/2024/05/australia-privacy-act-updates-expected-in-august-2024/ Mon, 13 May 2024 08:40:15 +0000

The next steps in Australia’s long-running reform of the privacy regime have been announced, with draft legislation expected to be tabled by August 2024. The reform is being presented as part of the Federal Government’s efforts to improve online safety, particularly for women, but it’s not clear at this stage how broad its remit will be.

Of the 116 recommendations for reform made by the Attorney-General’s Department in 2023, 38 were accepted in full by the Federal Government, and a further 68 accepted in principle, where more extensive consultation is required.

We are expecting all 38 of the “accepted in full” changes to be implemented in the August bill. These include:

  • changes to the civil penalty regime, to introduce low, medium and high tiers, based on the severity of the breach, to allow for more targeted enforcement;
  • a requirement for privacy policies to include details of any personal information used in substantially automated decisions with legal or other significant effects;
  • a right for individuals to request meaningful information about how substantially automated decisions with legal or other significant effects are made; and
  • a Children’s Online Privacy Code, for online services likely to be accessed by individuals under the age of 18.

We don’t know at this stage how many of the “accepted in principle” reforms will be tabled in August; however, in its messaging regarding the issue of online safety and the link with privacy reform, the Federal Government has highlighted:

  • the introduction of a statutory tort for serious invasions of privacy; and
  • expanding data subject rights beyond access and correction, to include a right of erasure, and a right to de-index certain online search results.

One issue which has been repeatedly highlighted is the need to offer protection against doxxing (i.e. the release of personal information with an intent to cause harm), as well as the wish to offer women suffering domestic and family violence “greater control and transparency over their personal information.”

Australia’s Attorney-General recently confirmed his views that the current regime is “woefully outdated and unfit for the digital age,” with “speed of innovation and the rise of artificial intelligence” underlining the need for reform.

We’ll provide further updates once more information about the August bill is available.

US: New Jersey Enacts Comprehensive State Privacy Law https://privacymatters.dlapiper.com/2024/02/us-new-jersey-enacts-comprehensive-state-privacy-law/ Tue, 13 Feb 2024 16:27:52 +0000

On January 16, 2024, the New Jersey Governor signed into law Senate Bill 332 (the “Act”), making New Jersey the 14th state to adopt a comprehensive state privacy law. The Act will take effect on January 15, 2025, and requires the Division of Consumer Affairs to issue rules and regulations to effectuate the Act; however, the Act does not specify a timeline for establishing such regulations.

Regulated Entities

The Act applies to entities that conduct business in New Jersey or produce products or services that are targeted to New Jersey residents, and that during a calendar year meet one of the following criteria:

  • control or process the personal data of at least 100,000 New Jersey consumers; or
  • control or process the personal data of at least 25,000 New Jersey consumers and derive revenue, or receive a discount on the price of any goods or services, from the “sale” of personal data.

Unlike many other comprehensive state privacy laws, the Act does not contain an exemption for nonprofits.[1] It does, however, exempt “financial institutions” that are subject to the Gramm-Leach-Bliley Act.  On the other hand, the Act (similar to the CCPA) only exempts “protected health information collected by a covered entity or business associate” subject to HIPAA but does not exempt covered entities (or business associates) in their entirety.  Like most state comprehensive privacy laws, the Act also contains some limited exemptions for personal data subject to certain federal privacy laws and regulations, including (1) personal data sold pursuant to the Drivers’ Privacy Protection Act of 1994, (2) personal data collected, processed, sold, or disclosed by a consumer reporting agency in compliance with the Fair Credit Reporting Act, and (3) personal data collected, processed, or disclosed as part of clinical research conducted in accordance with U.S. federal policy (45 C.F.R. Part 46) or FDA regulations (21 C.F.R. Parts 50 and 56) for the protection of human subjects in clinical research.

Key Definitions

For the most part, the definitions under the Act align to those of existing state comprehensive privacy laws.

Consumer: A “consumer” is “an identified person who is a resident of [New Jersey] acting only in an individual or household context.” As with the majority of the other state comprehensive privacy laws (not including the California Consumer Privacy Act or “CCPA”), the Act expressly excludes “a person acting in a commercial or employment context.”

Personal Data: Under the Act, “personal data” includes “any information that is linked or reasonably linkable to an identified or identifiable person. . . not [including] de-identified data or publicly available data.”

Profiling: Under the Act, “profiling” means “automated processing” of personal data “to evaluate, analyze or predict. . . an identified or identifiable individual’s economic situation, health, personal preferences, interests, reliability, behavior, location or movements.” The Act imposes varying obligations and restrictions on certain (automated) profiling activities that could impact consumers in a legal or similarly significant way or that pose a heightened risk of certain types of harm or negative impacts on consumers.

Sale: In line with the CCPA and the majority of state comprehensive privacy laws, the Act broadly defines “sale” to include “sharing, disclosing or transferring of personal data for monetary or other valuable consideration.”  However, in addition to carving out transfers to processors and transfers to provide a service requested by a consumer, the Act also specifically carves out from “sale” transfers to affiliates and transfers of personal data that a “consumer intentionally made available to the general public through a mass media channel and did not restrict to a specific audience.”

Sensitive Data: Similar to most comprehensive state privacy laws, under the Act,  “sensitive data” includes personal data revealing racial or ethnic origin, religious belief, mental or physical health condition, treatment or diagnosis, sex life or sexual orientation, citizenship or immigration status, genetic or biometric data that may be processed for the purpose of uniquely identifying an individual, personal data collected from a known child, and precise geolocation data. More broadly than most other state privacy laws, “sensitive data” also includes “financial information which shall include a consumer’s account number, account log-in, financial account, or credit or debit card number, in combination with any required security code, access code, or password that would permit access to a consumer’s financial account” and “status as transgender or non-binary.” 

Targeted Advertising: The term “targeted advertising” means advertising to a consumer “based on personal data obtained or inferred from that consumer’s activities over time and across nonaffiliated Internet web sites or online applications.”

Consumer Rights

In line with other state privacy laws in effect, the Act provides consumers with the following rights:

  • Right to access personal data;
  • Right to correct personal data;
  • Right to delete personal data;
  • Right to obtain a copy of personal data;
  • Right to opt out of the processing of personal data for purposes of targeted advertising, the sale of personal data, or profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer; and
  • Right to appeal a controller’s denial of a request to exercise one of the rights above.

Under the Act, consumers can designate an authorized agent to submit opt out requests on their behalf, but not requests to correct, delete, or access information about, or obtain a copy of, their personal data processed by the controller.

Consumers are entitled to at least one free request per year, after which the controller can charge a “reasonable fee” to cover the administrative cost of responding to requests that are “manifestly unfounded, excessive, or repetitive.”  Controllers are not required to respond to requests that they cannot authenticate, except for opt out requests, which do not have to be authenticated.

Key Obligations Under the Act

While most of the obligations apply to controllers, the Act also imposes some direct obligations on processors, including the requirement to assist the controller in meeting its obligations under the Act and to only process personal data in accordance with the controller’s instructions. A processor that processes personal data beyond the controller’s processing instructions will be deemed a controller under the Act (and subject to all of the controller obligations).

The key requirements under the Act include:

  • Privacy Notice: The Act requires controllers to provide consumers with a reasonably accessible, clear, and meaningful privacy notice that includes (1) the categories of personal data the controller processes; (2) the purpose for processing; (3) the categories of third parties to which the controller may disclose personal data; (4) the categories of personal data the controller shares with third parties; (5) how a consumer can exercise their privacy rights; (6) the process by which the controller will notify consumers of material changes to the privacy policy; and (7) an active email address or other online mechanism the consumer can use to contact the controller. 

In addition, controllers that sell personal data or process personal data for purposes of targeted advertising, sales, or automated profiling “in furtherance of decisions that produce legal or similarly significant effects concerning a consumer,” must “clearly and conspicuously disclose” such sales and processing and inform consumers of the manner in which they may opt out.

  • Data Protection Assessments: Like the majority of existing state comprehensive privacy laws, the Act will require controllers to conduct and document a data protection assessment prior to processing personal data that presents a “heightened risk of harm” to consumers. The definition of heightened risk of harm includes, for example, processing personal data for targeted advertising purposes, selling personal data, processing sensitive data, and processing personal data for the purposes of profiling that presents a reasonably foreseeable risk of certain types of harm (e.g., unlawful disparate impact on consumers, or financial or physical injury).  Processors are required to provide information to the controller as necessary to enable the controller to conduct and document data protection assessments.
  • Consumer Privacy Requests: Under the Act, controllers have 45 days to respond to consumer rights requests, which may be extended for an additional 45 days where “reasonably necessary.”  Processors are required to implement appropriate technical and organizational measures to enable the controller to meet its obligations to respond to consumer privacy requests.
  • Consumer Consent: Under the Act, controllers must obtain consumer consent to process: (1) sensitive data; (2) personal data for purposes that are not reasonably necessary to or compatible with the purposes of collection and processing, as initially disclosed to the consumer; and (3) personal data of individuals between 13 and 17 years old for the purpose of selling the data, serving targeted advertising, or profiling the individual.  Controllers must also provide consumers a mechanism for revoking consent that is as easy as the mechanism for providing consent.
  • Universal Opt-Out Mechanism: Six months from the effective date, the Act requires controllers engaged in targeted advertising or the “sale” of personal data to allow consumers to exercise the right to opt out of such processing through a user-selected universal opt-out mechanism. Further details will be provided in the forthcoming rules and regulations.
  • Collection Limitation: Controllers must limit the collection of personal data to what is adequate, relevant, and reasonably necessary for the purposes disclosed to the consumer and may not process personal data for incompatible purposes without first obtaining consent.
  • Security and Confidentiality: The Act imposes security obligations on both controllers and processors. Controllers are required to establish and maintain administrative, technical, and physical data security measures “appropriate to the volume and nature of the personal data,” including measures to protect the confidentiality, integrity and accessibility of personal data and secure it from unauthorized acquisition “during both storage and use.”  Processors are required to ensure that persons that process personal data are subject to confidentiality obligations and to help controllers meet their obligations to provide data breach notices and maintain reasonable security.

In addition, the Act imposes a joint obligation on both controllers and processors to implement “technical and organizational security measures to ensure a level of security that is appropriate to the risk” and establish a clear allocation of the responsibilities between them to implement the measures.

  • Processor and Subcontractor Contracts: Controllers and processors are required to enter into a written contract that sets forth the processing instructions, identifies the type of personal data and duration of processing, requires the return or deletion of personal data at the end of the engagement, imposes obligations on the processor to demonstrate compliance to the controller and allow for and contribute to reasonable assessments by the controller, and includes other required terms.  Processors are also required to enter into written contracts with subcontractors binding them to comply with the obligations applicable to the processor.
  • Discrimination: Controllers are prohibited from discriminating against consumers for exercising their rights under the Act or from increasing the cost for, or decreasing the availability of, a product or service based “solely on the exercise of a right and unrelated to feasibility or the value” of the service.

Enforcement

The Act will be enforced solely by the New Jersey Attorney General who may seek penalties of up to $10,000 for the first violation and up to $20,000 for the second and subsequent violations. There is no private right of action available under the Act.

For the first 18 months following the effective date of the Act (January 15th, 2025), there will be a 30-day cure period for violations.  During this time, the Division of Consumer Affairs must issue a notice of a violation to the controller “if a cure is deemed possible,” prior to bringing an enforcement action.  If the violation is not cured within 30 days, the Division of Consumer Affairs can then bring an enforcement action.   The right to cure only applies to violations by controllers—not processors. 


[1] While an earlier version of the bill included a definition for “business” that excluded non-profit entities, this definition and exclusion were struck and are not included in the final version.
