Kate Lucente and Lea Lurquin | Privacy Matters | DLA Piper Data Protection and Privacy

FTC Reiterates that Hashed and Pseudonymized Data is Still Identifiable Data
https://privacymatters.dlapiper.com/2024/07/ftc-reiterates-that-hashed-and-pseudonymized-data-is-still-identifiable-data/ | Fri, 26 Jul 2024

The Federal Trade Commission (FTC) reiterated its long-held view that hashing or pseudonymizing identifiers does not render data anonymous, in a post to its Technology Blog on July 24, 2024.

In the rather strongly worded post, while acknowledging that hashing and pseudonymizing data have the benefit of obscuring the underlying personal data, the FTC adamantly disagrees that these techniques render personal data anonymous, stating that:

[C]ompanies often claim that hashing allows them to preserve user privacy. This logic is as old as it is flawed – hashes aren’t “anonymous” and can still be used to identify users, and their misuse can lead to harm. Companies should not act or claim as if hashing personal information renders it anonymized.

The FTC emphasized that this has long been the agency's position, highlighting several prior enforcement actions on this point and citing its 2012 Technology Blog post, "Does Hashing Make Data 'Anonymous'?" (Rather than linking to the 2012 blog post, the FTC cheekily wrote: "To save a click, the answer is no, it does not.")

Unsurprisingly, the FTC seems focused on the use and disclosure of persistent online identifiers that are commonly used to recognize individuals and devices online, such as email addresses, phone numbers, MAC addresses, hashed email addresses, device identifiers, and advertising identifiers. In the post, the FTC stresses that hashing these identifiers does not relieve a company of its privacy obligations:

Regardless of what they look like, all user identifiers have the powerful capability to identify and track people over time, therefore the opacity of an identifier cannot be an excuse for improper use or disclosure.
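The FTC's point is easy to demonstrate: hashing is deterministic, so a hashed email address is simply a stable pseudonymous identifier, and anyone holding a list of candidate email addresses can reverse it by hashing each candidate and comparing. A minimal sketch in Python (the email addresses and the normalize-then-SHA-256 convention are illustrative assumptions, not from the FTC post):

```python
import hashlib

def hash_identifier(email: str) -> str:
    """Hash an email the way ad-tech pipelines commonly do: normalize, then SHA-256."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()

# The same input always yields the same digest, so the hash still
# singles out one person across datasets and over time.
h1 = hash_identifier("jane.doe@example.com")
h2 = hash_identifier("Jane.Doe@example.com ")
assert h1 == h2  # a persistent identifier, not anonymous data

# A simple dictionary attack reverses the hash for any known address:
known_emails = ["john@example.com", "jane.doe@example.com"]
reversed_match = {hash_identifier(e): e for e in known_emails}
print(reversed_match[h1])  # prints "jane.doe@example.com"
```

Because the digest is both persistent and reversible against known inputs, it can track and identify users exactly as the FTC describes.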

The FTC also made clear its position that it is deceptive for a company to claim or treat as anonymous hashed or pseudonymized identifiers that enable the tracking or targeting of an individual or device over time and indicated that this is an area of focus for enforcement:

FTC staff will remain vigilant to ensure companies are following the law and take action when the privacy claims they make are deceptive.

Takeaways?

While this is not a new position or development, the FTC is indicating that it is an area of focus now. It may be a good time to remind digital, advertising, and other teams that online and other persistent identifiers—hashed or otherwise—are still personal data and subject to privacy requirements. It may also make sense to review relevant practices and areas, such as online and in-app identifiers and tracking (analytics, advertising or otherwise) and targeted advertising, including retargeting and custom audience building and list matching.

In addition, businesses may want to review privacy policies and other public-facing privacy statements to make sure they do not claim or imply that hashed or pseudonymized data is anonymous or overstate the privacy benefits of these practices. 

More Information

For more information about these developments and FTC enforcement in general, contact your DLA relationship Partner, the authors of this post, or any member of our Data, Privacy, and Cybersecurity team.

US: Kentucky Legislature Passes Comprehensive State Privacy Law
https://privacymatters.dlapiper.com/2024/04/us-kentucky-legislature-passes-comprehensive-state-privacy-law/ | Mon, 29 Apr 2024

On April 4, 2024, Kentucky Governor Andy Beshear signed House Bill 15, an act related to Kentucky consumer data privacy (the "KCDPA"). Kentucky now joins the expanding list of states with comprehensive state privacy legislation, with the KCDPA set to take effect January 1, 2026.

Scope

The KCDPA applies to entities conducting business in Kentucky, or producing products or services targeted to Kentucky residents, and that during a calendar year meet one of the following criteria:

  • (1) control or process personal data of at least 100,000 Kentucky consumers; or
  • (2) control or process personal data of at least 25,000 Kentucky consumers and derive over 50% of gross revenue from the “sale” of personal data.

The KCDPA includes various entity-level exemptions commonly seen in other state privacy laws, which include, but are not limited to:

  • Any city, state agency, or political subdivision of the state;
  • Financial institutions subject to the Gramm-Leach-Bliley Act;
  • Covered entities or business associates governed under the Health Insurance Portability and Accountability Act (“HIPAA”);
  • Nonprofit organizations; and
  • Institutions of higher education.

Like most other state privacy laws, the bill contains data-level exemptions, which include but are not limited to, data processed in accordance with: HIPAA, the Fair Credit Reporting Act, the Driver’s Privacy Protection Act, the Family Educational Rights and Privacy Act, the Farm Credit Act, and the Children’s Online Privacy Protection Act (“COPPA”).

Key Definitions

The definitions under the KCDPA are generally consistent with those of existing comprehensive state privacy laws, with some of the key definitions mentioned below.

Consumer. A “consumer” means a natural person who is a resident of Kentucky acting only in an individual context. A consumer does not include a natural person acting in an employment context.

Personal Data. “Personal data” means any information that is linked or reasonably linkable to an identified or identifiable natural person. Personal data does not include de-identified data or publicly available information.

Profiling. “Profiling” means any form of automated processing performed on personal data to evaluate, analyze, or predict personal aspects related to an identified or identifiable natural person’s economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.

Sale. Under the KCDPA, “sale of personal data” is limited only to the exchange of personal data for monetary consideration by the controller to a third party.

Sensitive Data. “Sensitive data” means a category of personal data that includes (1) personal data indicating racial or ethnic origin, religious beliefs, mental or physical health diagnosis, sexual orientation, or citizenship or immigration status; (2) the processing of genetic or biometric data that is processed for the purpose of uniquely identifying a specific natural person; (3) the personal data collected from a known child; or (4) precise geolocation data.

Targeted Advertising. The term “targeted advertising” refers to displaying advertisements to a consumer where the advertisement is selected based on personal data obtained or inferred from that consumer’s activities over time and across nonaffiliated websites or online applications to predict that consumer’s preferences or interests.

Consumer Rights

Consistent with various other state privacy laws currently in effect, the KCDPA provides consumers with the following rights:

  • The right to confirm whether a controller is processing the consumer’s personal data and to access the personal data;
  • The right to correct inaccuracies in the consumer’s personal data;
  • The right to delete personal data provided by or obtained about the consumer;
  • The right to data portability; and
  • The right to opt-out of the processing of personal data for purposes of targeted advertising, the sale of personal data, or profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.

Consumers also have the right to appeal a controller’s refusal to take action on the consumer’s request. Further, controllers are prohibited from discriminating against a consumer for exercising their rights.

Key Obligations

While most obligations apply to controllers, the KCDPA imposes certain direct obligations on processors, including adhering to the instructions of the controller and assisting the controller in meeting its obligations under the KCDPA.

Consistent with other comprehensive state privacy laws, the KCDPA imposes various key obligations on controllers, as discussed below.

Privacy Notice. Controllers must provide consumers with a reasonably accessible, clear, and meaningful privacy notice that includes:

  •  The categories of personal data processed by the controller;
  • The purpose for processing personal data;
  • How consumers may exercise their consumer rights, including how a consumer may appeal a controller’s decision with regard to the consumer’s request;
  • The categories of personal data that the controller shares with third parties, if any; and
  • The categories of third parties, if any, with whom the controller shares personal data.

In addition, controllers that “sell” personal data to third parties or process personal data for targeted advertising are required to conspicuously disclose such activity in the privacy policy, as well as the manner in which a consumer may exercise the right to opt out.

The privacy policy must also include one (1) or more secure and reliable means for consumers to submit a request to exercise their consumer rights.

Consumer Privacy Requests. Under the KCDPA, controllers have 45 days to respond to a consumer’s privacy request, which may be extended an additional 45 days when “reasonably necessary,” provided that the controller informs the consumer of any extension within the initial 45-day response period, together with the reason for the extension.

Data Protection Assessment. The KCDPA requires controllers to conduct and document a data protection impact assessment in the following circumstances:

  • If processing personal data for targeted advertising;
  • If processing personal data for purposes of selling of personal data;
  • If processing personal data for purposes of profiling where the profiling presents a reasonably foreseeable risk to the consumer (i.e., unfair or deceptive treatment, financial, physical or reputational injury, etc.);
  • If processing sensitive data; or
  • If processing personal data presents a heightened risk of harm.

Notably, data protection assessment requirements apply only to processing activities created or generated on or after June 1, 2026.

Consumer Consent. Under the KCDPA, controllers must obtain a consumer’s consent to process sensitive data, as well as to process personal data for purposes that are neither reasonably necessary for, nor compatible with, the purposes disclosed to the consumer.

Collection Limitation. Controllers must limit the collection of personal data to what is adequate, relevant, and reasonably necessary in relation to the purposes for which the data is processed as disclosed to the consumer.

Security and Confidentiality. The KCDPA requires controllers to implement and maintain reasonable administrative, technical, and physical data security practices to protect the confidentiality, integrity, and accessibility of personal data.

Universal Opt-Out Mechanism. Unlike many of the recently enacted state privacy laws, the KCDPA does not require controllers to recognize universal opt-out signals as a way to process opt-out requests.

Enforcement

The Attorney General has exclusive authority to enforce violations of the KCDPA and may initiate an action seeking damages of up to $7,500 for each violation. The Attorney General may also recover reasonable expenses incurred in investigating and preparing the case, court costs, attorneys’ fees, and any other relief ordered by the court in any action initiated under the KCDPA.

Importantly, the KCDPA contains a right to cure provision, which does not sunset. Prior to initiating an action, the Attorney General is required to provide a controller or processor 30 days’ written notice identifying the specific provisions of the KCDPA the Attorney General alleges have been or are being violated. If the violation is not cured within the 30-day period, the Attorney General may then initiate an enforcement action.

Notably, there is no private right of action under the KCDPA.

For more information about these developments, contact the authors of this blog post, your DLA relationship Partner, or any member of DLA’s Data, Privacy and Cybersecurity team.

US: The FTC Cracks Down on Sensitive Personal Information Disclosures
https://privacymatters.dlapiper.com/2024/04/us-the-ftc-cracks-down-on-sensitive-personal-information-disclosures/ | Sat, 27 Apr 2024

The Federal Trade Commission (“FTC”) is taking bold actions to challenge businesses’ collection and monetization of consumers’ personal data—particularly sensitive personal data. This month, the FTC reached settlements with a data broker, X-Mode Social and its successor Outlogic LLC (“X-Mode”), and an alcohol addiction treatment firm, Monument Inc. (“Monument”), for, among other things, allegedly selling and/or sharing sensitive personal data to or with third-party advertising firms, without consent and contrary to each company’s public disclosures. These settlements are just two of several notable sensitive data-related enforcement actions by the FTC recently.

In this post, we summarize and provide key takeaways from the FTC’s enforcement against X-Mode and Monument.

I. The FTC’s Order Against X-Mode for Selling and Sharing Sensitive Location Information

The FTC reached an unprecedented settlement with data broker, X-Mode, prohibiting it from disclosing sensitive geolocation information and requiring it to delete or destroy all precise geolocation data previously collected as well as all products or services created with this data, unless it obtains valid consumer consent.

Background

In its complaint, the FTC alleges X-Mode sold precise geolocation data that could be used to track individuals’ visits to sensitive locations such as reproductive health clinics, shelters, medical clinics, or places of worship, in violation of Section 5 of the FTC Act, which prohibits companies from engaging in unfair and deceptive trade practices. The FTC alleges X-Mode surreptitiously collected and sold precise geolocation data from millions of users without their consent, in violation of their privacy rights, and in direct opposition to the company’s own public representations.

In particular, the FTC alleges that X-Mode did not adequately disclose the intended use of users’ geolocation data and did not secure valid informed and affirmative consent from users prior to the data collection and/or sharing. Further, the company did not provide users of its own apps (e.g., Drunk Mode and Walk Against Humanity) with transparent notices describing the purposes for collecting and processing geolocation information, or notify them that their information would be sold to government contractors for national security purposes. Additionally, X-Mode allegedly failed to honor Android users’ requests to opt out of such data collection and provided third parties access to these users’ sensitive personal data in conflict with their privacy choices.

Despite having two of its own apps that collect geolocation information, X-Mode primarily relied on third-party app publishers to amass the location information it collected and sold. The FTC claims the company provided sample consumer notices to these third-party app publishers that misled consumers about the purposes for which their location information was being collected, used, and could otherwise be processed. The company also allegedly failed to verify that the third-party app publishers were, on their own, notifying their consumers of the relevant processing purposes and obtaining valid consent.

Additionally, the FTC alleges the company targeted consumers based on sensitive characteristics and failed to remove sensitive geolocation information from the raw location data it sold to third parties downstream. It also failed to implement reasonable or appropriate safeguards to protect against harmful downstream uses of the location information it sold.

FTC Order Requirements

The FTC’s decision and order prohibits X-Mode from selling or sharing any sensitive location data and requires the company to:

  • delete or destroy all precise geolocation data previously collected as well as all products or services created with this data, unless it obtains valid consumer consent or ensures the data has been de-identified or rendered non-sensitive.
  • maintain a comprehensive record of all sensitive location data it collects and maintains, to ensure it is adequately protecting and not unlawfully selling or sharing this information.
  • develop a supplier assessment program to ensure that third parties who provide location data to X-Mode:
    • obtain affirmative express consent from consumers for the collection, use, and sale of their data and
    • ensure that data brokers/providers are tracking and honoring individuals’ requests to opt out of the sale/disclosure of their data.
  • ensure all recipients of its location data do not associate it with sensitive locations, such as medical facilities, religious institutions, shelters, schools, union offices, and immigrant service offices.
  • notify the FTC within thirty (30) days of determining there was a “third-party incident,” defined as a third-party sharing X-Mode’s location data in violation of its contractual limitations.
  • establish a data retention schedule and implement a comprehensive privacy program that adequately protects consumers’ personal information.

The order specifies that disclosures requesting consumers’ “affirmative express consent” must be “clear and conspicuous” and separate from any existing terms of service, terms of use, or privacy policy. Hovering over, muting, pausing, or closing a piece of content on a website will not constitute affirmative express consent.

Likewise, the FTC’s order against Monument for certain alleged disclosures of sensitive health data stipulates similar remedial measures.

II. The FTC’s Order Against Monument regarding Disclosures of Sensitive Health Data to Third Parties for Marketing Purposes

The FTC announced a proposed order prohibiting alcohol addiction treatment company Monument from disclosing individuals’ health information to third-party advertising companies and platforms for purposes of targeted advertising without valid consent.

Background

In its complaint, the FTC alleges that Monument used online tracking technologies such as cookies, pixels, APIs, and other similar technologies, to collect personal data about individuals who visited and interacted with Monument’s websites and other online and subscription services. The relevant data includes name, email address, address, phone number, date of birth, IP address, government issued ID, information about alcohol consumption and medical history, device identifiers, and other relevant information about the 84,000 impacted individuals. Once collected, Monument allegedly categorized this information into ‘Custom Events’ and provided the Custom Event information along with email addresses, IP addresses, and other unique identifiers to the third-party advertisers for re-targeting and custom audience purposes, allowing advertisers to identify specific individuals for targeted advertising. The complaint further alleges that Monument’s contracts with these third-party advertisers did not limit the third parties’ downstream use of the disclosed personal data for their own commercial purposes.

The FTC noted that Monument publicly claimed in its privacy policy that it was fully compliant with the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”) and that any information provided by individuals would be kept “100% confidential, secure and HIPAA compliant.” The policy also stated that Monument would not disclose any personal data, including health information, to third parties without the individual’s written consent. Nonetheless, the privacy policy simultaneously stated that Monument would disclose personal data, including health information, to third parties, including for marketing purposes.

The FTC claims that these disclosures and representations violate Section 5 of the FTC Act for misrepresentations and deceptive omissions of material facts constituting deceptive practices, and the Opioid Addiction Recovery Fraud Prevention Act of 2018 (“OARFPA”) for unfair or deceptive acts or practices related to information regarding a substance use disorder treatment service and/or product.

FTC Order Requirements

Under the order, along with imposing a (suspended) $2.5 million civil penalty and amongst other things, Monument must:

  • identify all health information the company shared with relevant third parties for unlawful purposes and instruct the third-party recipients to delete such data;
  • provide notice to all impacted individuals about the unlawful disclosure of their personal data, including their health information;
  • not disclose any health information to third parties for advertising purposes;
  • obtain an individual’s affirmative express consent prior to disclosing health information for any purpose other than advertising (which is prohibited under the order); and
  • not make deceptive or misleading statements to promote its services, such as about its HIPAA compliance and its data practices.

Monument is also ordered to implement a comprehensive privacy program to protect the privacy and security of the personal data it collects, retains, and discloses. The privacy program must include:

  • a privacy officer who is a designated and qualified employee that reports to a senior executive and who is responsible for the privacy program;
  • regular assessments of the company’s privacy operations concerning personal data;
  • adequate technical, administrative, and organizational safeguards to protect personal data, including reviews of its relevant contracts with third parties;
  • a data retention policy that limits retention of personal data to the shortest time necessary to fulfill the purposes for which it was collected and the retention schedule must be made publicly available; and
  • processes to maintain records of processing activities that capture the personal data that is collected on behalf of and/or disclosed to a third party.

III. Takeaways

In line with its other recent enforcement actions, these orders underscore the FTC’s commitment to restraining the collection, sale, or disclosure of consumers’ sensitive personal information. Businesses that collect, sell, or otherwise process sensitive personal information, and particularly precise geolocation information and health information, should:

  • Establish and implement a comprehensive privacy program that adequately maps the company’s collection and processing of personal information and protects consumers’ personal information;
  • Conduct due diligence of downstream third-party businesses and service providers to whom it discloses personal information and ensure that adequate contractual terms are in place;
  • Obtain affirmative and informed prior consent from individuals for the collection, use, disclosure and/or sale of their sensitive personal data;
  • Avoid sharing, selling, or otherwise disclosing sensitive geolocation data and health information;
  • Ensure data providers/data brokers who supply the company with personal information are collecting informed, affirmative and valid consent from individuals and honoring opt-outs as necessary; and
  • Review their data retention schedules and practices.

These orders highlight the growing importance of implementing and maintaining a comprehensive, well-rounded privacy program that goes beyond providing a cookie-cutter privacy policy, and the FTC’s willingness to increase oversight and institute significant consequences against those who don’t.

For more information about these developments and FTC enforcement in general, contact your DLA relationship Partner, the authors of this post, or any member of our Data, Privacy, and Cybersecurity team.

US: CCPA and California Privacy Protection Agency Updates: 2024 to Date
https://privacymatters.dlapiper.com/2024/04/ccpa-and-california-privacy-protection-agency-updates-2024-to-date/ | Wed, 24 Apr 2024

The California Privacy Protection Agency (“CPPA”) has been active since the start of the year. In this blog post we summarize some key activities of the CPPA to date in 2024, including:

  • On April 2, 2024, the CPPA Enforcement Division issued its inaugural advisory, emphasizing the importance of data minimization.  (Read more about the enforcement advisory below.)
  • In March 2024, the CPPA’s March Board Meeting included several notable developments, including:
    • Draft proposed regulations on risk assessments and automated decision-making technology.
    • Draft updates to existing CCPA Regulations, including updates to the definition of sensitive personal information and requirements relating to verifying and denying consumer requests.
    • A summary of the CPPA’s enforcement priorities for 2024, which include privacy notices, right to delete issues, and the processing of consumer requests.
    • A report on the number of complaints received by the CPPA since July 2023.

(Read more about the March 2024 Board Meeting below.)

  • On February 9, 2024, the CPPA won its appeal of a lower court ruling that delayed for one year the enforcement of the updated CCPA Regulations, implemented pursuant to the California Privacy Rights Act of 2020.   
  • In January 2024, the CPPA launched https://privacy.ca.gov, a new online resource on California privacy rights for consumers.

In 2024, the CPPA has also weighed in on proposed federal and state privacy legislation, issuing a statement heavily critical of the federal American Privacy Rights Act legislation, and strongly supporting California’s AB 3048, which would expand business requirements regarding privacy preference and opt out signals.

CPPA Enforcement Advisory on Data Minimization

On April 2, 2024, the CPPA issued its inaugural enforcement advisory under the California Consumer Privacy Act (“CCPA”), which focuses on the need for businesses to apply data minimization principles across their processing activities and their handling of consumer privacy requests, emphasizing:

Data minimization is a foundational principle in the CCPA. Businesses should apply this principle to every purpose for which they collect, use, retain, and share consumers’ personal information.

The CPPA also observed that:

[C]ertain businesses are asking consumers to provide excessive and unnecessary personal information in response to requests that consumers make under the CCPA.

As one of many core principles of the CCPA, data minimization requires businesses to restrict the processing of personal information to that which is “reasonably necessary and proportionate to achieve the purposes for which the personal information was collected or processed.”[1]

Regulations issued pursuant to the CCPA expand on this principle, stating the necessary and proportionate assessment should be based on the following:

  1. The minimum personal information that is necessary to achieve the purpose identified, as disclosed to the consumer;
  2. The possible negative impacts on consumers posed by the business’s collection or processing of personal information; and
  3. Additional safeguards used by the business to address the possible negative impacts on consumers.[2]

Data Minimization in Verifying Consumer Requests. When responding to consumer requests, the CCPA requires businesses to verify that the person making a request to delete, correct, or know is the consumer about whom the business has collected personal information.

The CCPA prohibits businesses from requiring a consumer to verify their identity to make a request to opt-out of the sale/sharing of personal information or to limit use and disclosure of sensitive personal information; however, the business may ask the consumer for information necessary to complete the request.

The CCPA regulations provide businesses with guidance in determining the method by which the business will verify the consumer’s identity:

  1. Whenever feasible, match the identifying information provided by the consumer to the consumer’s personal information the business already maintains, or use a third-party identity verification service;
  2. Avoid collecting certain types of personal information (such as Social Security number, driver’s license number, financial account numbers, or unique biometric data), unless necessary for the purpose of verifying the consumer; and
  3. Consider the following factors, including (i) the type, sensitivity, and value of the personal information collected and maintained about the consumer; (ii) the risk of harm to the consumer; (iii) the likelihood that fraudulent or malicious actors would seek the personal information; (iv) whether the personal information to be provided by the consumer to verify their identity is sufficiently robust to protect against fraudulent requests or being spoofed or fabricated; (v) the manner in which the business interacts with the consumer, and (vi) available technology for verification.[3]

Businesses must generally avoid requesting additional information from the consumer for verification purposes; however, to the extent the business cannot verify the consumer’s identity, the business may request additional information which must only be used for verifying the consumer’s identity, security, or fraud-prevention. The business must delete any new personal information collected for verification purposes as soon as practical after processing the consumer’s request, subject to the CCPA’s record-keeping requirements.

Questions to Consider When Responding to Consumer Requests. The advisory includes illustrative scenarios on the application of the data minimization principle to CCPA requests to opt-out of the sale/sharing of personal information and requests to delete personal information.  The advisory also provides a list of questions for businesses to consider when processing consumer requests:

  1. What is the minimum personal information that is necessary to achieve this purpose?
  2. We already have certain personal information from this consumer. Do we need to ask for more personal information than we already have?
  3. What are the possible negative impacts posed if we collect or use the personal information in this manner?
  4. Are there additional safeguards we could put in place to address the possible negative impacts?

Businesses should keep the above questions in mind when determining how to verify and process consumer requests.

For more information about these developments, contact the authors of this blog post, your DLA relationship Partner, or any member of DLA’s Data, Privacy and Cybersecurity team.

Takeaways from CPPA March 2024 Board Meeting: Enforcement Priorities and Revised Regulations on the Horizon

On March 8, 2024, the CPPA held a public meeting to discuss, among other things, its enforcement priorities and proposed regulations on risk assessments and automated decisionmaking technology (“ADMT”). This article summarizes the key takeaways from the meeting and highlights from the new regulations on the horizon in California.

Enforcement Priorities. During the meeting, Michael Macko, the Deputy Director of the Enforcement Division, presented on enforcement updates and priorities. The presentation reported that the CPPA received 1,208 complaints between July 6, 2023, and February 22, 2024. It may come as no surprise to privacy officers and compliance managers that the most common categories of complaints include right to delete and right to opt-out of sale issues.

The CPPA reported that its upcoming enforcement priorities will be privacy notices, right to delete issues, and implementation of consumer requests.[4]

ADMT and Risk Assessment Regulations. As we recently reported, in late 2023, the CPPA released its initial draft regulations for ADMT and risk assessments. During the March 8, 2024 meeting, the Board was presented with an updated draft of the ADMT and risk assessment regulations and voted to progress these proposed regulations to formal rulemaking. It is important to note that the regulations are discussion drafts that are still in the preliminary rulemaking phase. Staff will begin preparing the required paperwork to initiate formal rulemaking based on the Board’s vote. During the meeting, CPPA General Counsel Philip Laird clarified that the Agency intends to do more public engagement this spring and summer for additional feedback on the draft ADMT and risk assessment regulations. On April 24, 2024, the CPPA announced three stakeholder sessions to take place this May. More information about the sessions and how you can attend is available on the CPPA website. Additional modifications may be made to the draft regulations based on feedback from the Board and the public throughout this process.

The following are notable updates to draft ADMT and risk assessment requirements in these new proposed draft regulations:

  • Revised Definition of ADMT. The CPPA has revised the definition of ADMT to mean “any technology that processes personal information and uses computation to execute a decision, replace human decisionmaking, or substantially facilitate human decisionmaking.” For purposes of this definition, the CPPA clarified that to “substantially facilitate human decisionmaking” means using the output of the technology as a key factor in a human’s decisionmaking. This includes, for example, using ADMT to generate a score about a consumer that the human reviewer uses as a primary factor to make a significant decision about them.
  • ADMT Exclusions. The CPPA has clarified that ADMT does not include the following technologies, provided these technologies do not execute a decision, replace human decisionmaking, or substantially facilitate human decisionmaking: web hosting, domain registration, networking, caching, website-loading, data storage, firewalls, anti-virus, anti-malware, spam and robocall-filtering, spellchecking, calculators, databases, spreadsheets, or similar technologies.
  • Revised Definition of Profiling. The CPPA has expanded the definition of profiling to include automated processing of personal information to analyze or predict an individual’s intelligence, ability, aptitude, mental health and predispositions.
  • New Trigger for Notice, Opt-Out and Access. The CPPA has revised the triggers for pre-use notice, opt-out, and access requirements by adding the use of ADMT for “profiling a consumer for behavioral advertising” as a trigger.
  • Updated Pre-Use Notice Requirements for ADMT. The CPPA has updated the pre-use notice requirements to streamline the information that a business must provide, and to allow for greater flexibility in how the business presents the information. The proposed revisions also include tailoring pre-use notice requirements to specific uses of ADMT and requiring that the business disclose that it cannot retaliate against consumers.
  • Opt-Out Exceptions for ADMT. Under the proposed regulations, businesses would not be required to provide a consumer with the ability to opt-out of a business’s use of ADMT for a significant decision concerning the consumer if the business provides consumers with the ability to appeal to a human decisionmaker (the “human appeal exception”). To qualify for the human appeal exception, the business must satisfy certain requirements, including but not limited to, designating a qualified human reviewer who must consider relevant information, clearly describing how consumers can submit an appeal, and enabling the consumer to provide information for the human reviewer to consider. The proposed regulations also include an “evaluation exception” where a business does not need to provide a consumer with the ability to opt-out (subject to certain conditions) for purposes of admission, acceptance, or hiring decisions, allocation/assignment of work and compensation decisions, and work or educational profiling. Businesses would also not be required to provide a consumer with the ability to opt-out if the business’s use of the ADMT is necessary for security, fraud prevention, or safety purposes.
  • Revised Risk Assessment Thresholds. The CPPA has revised the risk assessment thresholds to clarify that risk assessments are required when the business (1) sells or shares personal information; (2) processes sensitive personal information (including the personal information of consumers that the business has actual knowledge are less than 16 years of age); (3) uses ADMT for a significant decision or “extensive profiling” (i.e., work or educational profiling, public profiling, or profiling a consumer for behavioral advertising); or (4) processes personal information to train ADMT or artificial intelligence that is capable of being used for a significant decision, to establish identity, for physical or biological profiling, for generating deepfakes, or for operating generative models.
  • Revised Risk Assessment Requirements. The CPPA’s proposed revisions include clarifying which operational elements must be identified in a risk assessment, which negative impacts to a consumers’ privacy a business may consider, and which safeguards a business must identify for ADMT to ensure the ADMT works as intended and does not discriminate.
  • Revised Risk Assessment Submission Requirements. The CPPA has streamlined what must be included in an abridged risk assessment and further clarified exemptions to the risk assessment submission requirements. For example, a business is not required to submit a risk assessment if the business has previously conducted and submitted to the CPPA an abridged risk assessment for a given processing activity, and there were no material changes to that processing during a subsequent submission period (however, the business must still submit a certification of compliance to the Agency).

Draft Updates to Existing CCPA Regulations. In addition to the initial draft regulations for ADMT and risk assessments, the CPPA also discussed revisions to the pre-existing CCPA regulations. Similar to the Risk Assessment and ADMT regulations discussed above, formal rulemaking proceedings are still pending for these proposed amendments, which include the following notable updates:

  • Revised Definition of Sensitive Personal Information. The CPPA proposed revising the definition of sensitive personal information to include “[p]ersonal information of consumers that the business has actual knowledge are less than 16 years of age.” The proposed revisions further clarify that businesses that willfully disregard the consumer’s age shall be deemed to have had actual knowledge of the consumer’s age.
  • Denying Consumer Requests. Under the revised regulations, if the business denies a consumer’s request to know, correct, delete, opt-out of the sale/sharing of personal information, or limit use and disclosure of sensitive personal information, the business must, among other things, inform the consumer that they can file a complaint with the Agency and the Attorney General and provide links to the complaint forms available on their respective websites.
  • Verification of Consumer Requests. Under the revised regulations, businesses would be required to match identifying information provided by the consumer to the personal information of the consumer already maintained by the business before requesting additional information from the consumer (emphasis added).
  • Service Providers and Contractors. The CPPA proposed adding a requirement that any retention, use, or disclosure of personal information by service providers or contractors pursuant to its written contract with a business must be “reasonably necessary and proportionate” for the purposes stated in the contract.

For more information about these developments, contact your DLA Piper relationship partner, the authors of this alert, or any member of our Data Protection, Privacy and Security team.


[1] Civil Code § 1798.100(c)

[2] 11 CCR § 7002(d)

[3] 11 CCR § 7060(c)

[4] See the CPPA Enforcement Update & Priorities presentation available at https://cppa.ca.gov/meetings/materials/20240308_item6_enforcement_update.pdf.

US: New Hampshire Enacts 15th Comprehensive State Privacy Law https://privacymatters.dlapiper.com/2024/04/us-new-hampshire-enacts-15th-comprehensive-state-privacy-law/ Fri, 12 Apr 2024 21:46:47 +0000

On March 6, 2024, the New Hampshire Governor signed into law Senate Bill 255 (the “NH Act”), making New Hampshire the 15th state to adopt a comprehensive state privacy law. The NH Act will take effect January 1, 2025. This post explores how the NH Act stacks up against the other comprehensive state privacy laws.

Applicability

The NH Act applies to covered businesses that either conduct business in New Hampshire or produce products or services targeted toward New Hampshire residents, and meet either of the following thresholds during a one-year period:

  • control or process the personal data of not less than 35,000 unique consumers, excluding personal data controlled or processed solely for the purpose of completing a payment transaction; or
  • control or process the personal data of not less than 10,000 unique consumers and derive more than 25 percent of their gross revenue from the sale of personal data.

These thresholds are considerably lower than those of most other states’ privacy laws. Businesses that may not trigger compliance with other state privacy laws, including those currently in effect (such as California, Colorado, Connecticut, Virginia, and Utah), should review their practices and determine whether these lower thresholds trigger compliance in New Hampshire.

Like many other state privacy laws, the NH Act contains various exemptions, such as those for nonprofits, institutions of higher education, financial institutions, and data subject to the Gramm-Leach-Bliley Act. Additionally, the NH Act provides several Health Insurance Portability and Accountability Act of 1996 (“HIPAA”) exemptions, including those for “covered entities,” “business associates,” and “protected health information” (as these terms are defined under HIPAA).

Key Definitions

The NH Act’s definitions largely align with definitions from other state privacy laws. For instance:

Consent: Like most other state privacy laws, “consent” means “a clear affirmative act signifying a consumer’s freely given, specific, informed and unambiguous agreement to allow the processing of personal data relating to the consumer.” This does not include “acceptance of a general or broad terms of use,” similar methods that bury language regarding processing personal data, or “the use of deceptive design patterns.”

Consumer: Under the NH Act, a “consumer” is “an individual who is a resident of [New Hampshire].” Similar to many other state privacy laws, “consumer” does not include an “individual acting in a commercial or employment context.”

De-identified Data: Means “data that cannot reasonably be used to infer information about, or otherwise be linked to, an identified or identifiable individual, or a device linked to such individual.”

Personal Data: Means “any information that is linked or reasonably linkable to an identified or identifiable individual” but “does not include de-identified data or publicly available information.”

Profiling: Means “any form of automated processing performed on personal data to evaluate, analyze, or predict personal aspects related to an identified or identifiable individual’s economic situation, health, personal preferences, interests, reliability, behavior, location or movements.”

Sale of Personal Data: Means “the exchange of personal data for monetary or other valuable consideration by the controller to a third party.” This does not include disclosures to processors or to third parties for purposes of providing a product or service that the consumer requested. The NH Act also limits this definition by carving out disclosures that the consumer requests or that involve information the consumer intentionally made available to the general public “via a channel of mass media.” Additionally, a “sale of personal data” does not occur when the controller discloses or transfers the information to an affiliate.

Sensitive Data: Means “personal data that includes data revealing racial or ethnic origin, religious beliefs, mental or physical health condition or diagnosis, sex life, sexual orientation or citizenship or immigration status; the processing of genetic or biometric data for the purpose of uniquely identifying an individual; personal data collected from a known child; or, precise geolocation data.”

Targeted Advertising: Means advertising to a consumer “based on personal data obtained or inferred from that consumer’s activities over time and across nonaffiliated Internet web sites or online applications to predict such consumer’s preferences or interests.”

Key Obligations

The NH Act imposes obligations on both controllers and processors, and like most comprehensive privacy laws, the majority of the responsibilities fall on controllers. Similar to other state comprehensive privacy laws, processors must adhere to the controller’s instructions, assist the controller in meeting its obligations, and enter into a data processing agreement with the controller.

Key requirements under the NH Act include:

  • Privacy Notice: Under the NH Act, controllers must provide consumers with a reasonably accessible, clear, and meaningful privacy notice that includes: (1) the categories of personal data processed by the controller; (2) the purpose for processing personal data; (3) how consumers may exercise their consumer rights, including how a consumer may appeal a controller’s decision; (4) the categories of personal data that the controller shares with third parties, if any; (5) the categories of third parties, if any, with which the controller shares personal data; and (6) an active electronic mail address or other online mechanism that the consumer may use to contact the controller. Importantly, the notice must meet the standards that the NH Act delegates to the New Hampshire Secretary of State to develop. These standards are forthcoming.
  • Data Minimization & Purpose Limitation: Like most other comprehensive state privacy laws, the NH Act requires controllers to limit the collection of personal data to what is adequate, relevant, and reasonably necessary for the purposes disclosed to the consumer and not process the data for incompatible purposes unless the controller first obtains the consumer’s consent.
  • Security: The NH Act requires controllers to establish, implement, and maintain reasonable administrative, technical and physical data security practices to protect the confidentiality, integrity and accessibility of personal data “appropriate to the volume and nature of the personal data at issue.” Processors must ensure that persons that process personal data are subject to a confidentiality duty for that data and assist controllers in meeting their obligations to provide data breach notices and maintain reasonable security.
  • Opt-Out Preference Signal: By January 1, 2025, the NH Act requires controllers to allow consumers to opt-out of any processing of the consumer’s personal data for the purposes of targeted advertising, or any sale of such personal data, through an opt-out preference signal.
  • Data Protection Assessments: The NH Act also requires controllers to conduct and document data protection assessments for each processing activity that “presents a heightened risk of harm to the consumer.” This includes: (1) the processing of personal data for the purposes of targeted advertising; (2) the sale of personal data; (3) the processing of sensitive data; and (4) profiling, when such profiling presents a reasonably foreseeable risk of:
    • Unfair or deceptive treatment of consumers;
    • Unlawful disparate impact on consumers;
    • Financial, physical or reputational injury to consumers;
    • A physical or other intrusion upon the solitude or seclusion, or the private affairs or concerns, of consumers, where such intrusion would be offensive to a reasonable person; and
    • Other substantial injury to consumers.
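The opt-out preference signal requirement above does not name a particular signal, but the Global Privacy Control (GPC), which participating browsers transmit as a `Sec-GPC: 1` HTTP request header, is the most widely recognized one. A minimal server-side check might look like the following sketch; the function name and the plain header dictionary are illustrative only, not drawn from the NH Act or any particular web framework:

```python
def honors_opt_out_signal(request_headers: dict) -> bool:
    """Return True if the request carries an opt-out preference signal.

    Assumes the Global Privacy Control convention, under which
    participating browsers send the header `Sec-GPC: 1`. Header names
    are matched case-insensitively, as HTTP header names are
    case-insensitive.
    """
    normalized = {k.lower(): v.strip() for k, v in request_headers.items()}
    return normalized.get("sec-gpc") == "1"


# A request from a GPC-enabled browser signals the opt-out:
print(honors_opt_out_signal({"Sec-GPC": "1", "User-Agent": "ExampleBrowser"}))  # True
```

A real deployment would wire a check like this into the request pipeline and record the opt-out against the consumer's profile so it persists beyond the single request.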

Consumer Rights

In line with other state privacy laws in effect, the NH Act provides consumers with the following rights:

  • Right to access personal data
  • Right to correct inaccuracies in personal data
  • Right to delete personal data
  • Right to obtain a copy of personal data
  • Right to opt-out of the processing of the personal data for purposes of targeted advertising
  • Right to opt-out of the sale of personal data (as defined above)
  • Right to opt-out of profiling in furtherance of solely automated decisions that produce legal or similarly significant effects concerning the consumer
  • Right to appeal a controller’s denial of a request to exercise one of the rights above

A consumer may also designate an authorized agent to submit opt out requests on the consumer’s behalf, but not requests to correct, delete, or access information about, or obtain a copy of, their personal data processed by the controller. Additionally, consumers are entitled to at least one free request per year, after which a controller may charge a “reasonable fee” to cover administrative costs associated with handling the request.

Similar to many other states, the NH Act requires controllers to respond to a rights request within 45 days, subject to an additional 45-day extension when “reasonably necessary.” The controller must inform the consumer about the extension within the initial 45-day period and provide a rationale for the extension.
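The 45-day clock with its single 45-day extension is simple to operationalize. The sketch below uses Python's standard library and assumes plain calendar days with no tolling, which is an assumption on our part; the NH Act does not address tolling:

```python
from datetime import date, timedelta

RESPONSE_WINDOW = timedelta(days=45)


def response_deadline(received: date, extended: bool = False) -> date:
    """Deadline to respond to a consumer rights request under the NH Act.

    45 calendar days from receipt, plus one additional 45-day extension
    when "reasonably necessary" (the consumer must be notified of the
    extension, with a rationale, within the initial 45-day window).
    """
    deadline = received + RESPONSE_WINDOW
    if extended:
        deadline += RESPONSE_WINDOW
    return deadline


print(response_deadline(date(2025, 1, 1)))        # 2025-02-15
print(response_deadline(date(2025, 1, 1), True))  # 2025-04-01
```

In practice, a compliance workflow would also track the date the extension notice was sent to the consumer, since that notice must go out before the initial 45-day deadline.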

Enforcement

The New Hampshire Attorney General (the “Attorney General”) has the exclusive authority to enforce the NH Act. The NH Act does not specify any statutory penalties. Like most other state privacy laws, the NH Act does not provide for a private right of action.

The NH Act also provides covered businesses a 60-day cure period to address alleged violations until December 31, 2025. Beginning January 1, 2026, the Attorney General may provide controllers the opportunity to cure after considering the following factors: (1) the number of violations; (2) the size and complexity of the controller or processor; (3) the nature and extent of the controller’s or processor’s processing activities; (4) the substantial likelihood of injury to the public; (5) the safety of persons or property; and (6) whether such alleged violation was likely caused by human or technical error.

In addition to the NH Act, several other newly adopted privacy laws are set to take effect in 2024, 2025, and beyond. For more information about these developments, please contact your DLA Piper relationship partner, the authors of this alert, or any member of our Data Protection, Privacy and Cybersecurity Practice.

US: New Jersey Enacts Comprehensive State Privacy Law https://privacymatters.dlapiper.com/2024/02/us-new-jersey-enacts-comprehensive-state-privacy-law/ Tue, 13 Feb 2024 16:27:52 +0000

On January 16, 2024, the New Jersey Governor signed into law Senate Bill 332 (the “Act”), making New Jersey the 14th state to adopt a comprehensive state privacy law. The Act will take effect on January 15, 2025, and requires the Division of Consumer Affairs to issue rules and regulations to effectuate the Act; however, the Act does not specify a timeline for establishing such regulations.

Regulated Entities

The Act applies to entities that conduct business in New Jersey or produce products or services that are targeted to New Jersey residents, and that during a calendar year meet one of the following criteria:

  • control or process the personal data of at least 100,000 New Jersey consumers; or
  • control or process the personal data of at least 25,000 New Jersey consumers and derive revenue, or receive a discount on the price of any goods or services, from the “sale” of personal data.

Unlike many other comprehensive state privacy laws, the Act does not contain an exemption for nonprofits.[1] It does, however, exempt “financial institutions” that are subject to the Gramm-Leach-Bliley Act.  On the other hand, the Act (similar to the CCPA) only exempts “protected health information collected by a covered entity or business associate” subject to HIPAA but does not exempt covered entities (or business associates) in their entirety.  Like most state comprehensive privacy laws, the Act also contains some limited exemptions for personal data subject to certain federal privacy laws and regulations, including (1) personal data sold pursuant to the Drivers’ Privacy Protection Act of 1994, (2) personal data collected, processed, sold, or disclosed by a consumer reporting agency in compliance with the Fair Credit Reporting Act, and (3) personal data collected, processed, or disclosed as part of clinical research conducted in accordance with U.S. federal policy (45 C.F.R. Part 46) or FDA regulations (21 C.F.R. Parts 50 and 56) for the protection of human subjects in clinical research.

Key Definitions

For the most part, the definitions under the Act align to those of existing state comprehensive privacy laws.

Consumer: A “consumer” is “an identified person who is a resident of [New Jersey] acting only in an individual or household context.” As with the majority of the other state comprehensive privacy laws (not including the California Consumer Privacy Act or “CCPA”), the Act expressly excludes “a person acting in a commercial or employment context.”

Personal Data: Under the Act, “personal data” includes “any information that is linked or reasonably linkable to an identified or identifiable person. . . not [including] de-identified data or publicly available data.”

Profiling: Under the Act, “profiling” means “automated processing” of personal data “to evaluate, analyze or predict. . . an identified or identifiable individual’s economic situation, health, personal preferences, interests, reliability, behavior, location or movements.” The Act imposes varying obligations and restrictions on certain (automated) profiling activities that could impact consumers in a legal or similarly significant way or that pose a heightened risk of certain types of harm or negative impacts on consumers.

Sale: In line with the CCPA and the majority of state comprehensive privacy laws, the Act broadly defines “sale” to include “sharing, disclosing or transferring of personal data for monetary or other valuable consideration.”  However, in addition to carving out transfers to processors and transfers to provide a service requested by a consumer, the Act also specifically carves out from “sale” transfers to affiliates and transfers of personal data that a “consumer intentionally made available to the general public through a mass media channel and did not restrict to a specific audience.”

Sensitive Data: Similar to most comprehensive state privacy laws, under the Act,  “sensitive data” includes personal data revealing racial or ethnic origin, religious belief, mental or physical health condition, treatment or diagnosis, sex life or sexual orientation, citizenship or immigration status, genetic or biometric data that may be processed for the purpose of uniquely identifying an individual, personal data collected from a known child, and precise geolocation data. More broadly than most other state privacy laws, “sensitive data” also includes “financial information which shall include a consumer’s account number, account log-in, financial account, or credit or debit card number, in combination with any required security code, access code, or password that would permit access to a consumer’s financial account” and “status as transgender or non-binary.” 

Targeted Advertising: The term “targeted advertising” means advertising to a consumer “based on personal data obtained or inferred from that consumer’s activities over time and across nonaffiliated Internet web sites or online applications.”

Consumer Rights

In line with other state privacy laws in effect, the Act provides consumers with the following rights:

  • Right to access personal data;
  • Right to correct personal data;
  • Right to delete personal data;
  • Right to obtain a copy of personal data;
  • Right to opt out of the processing of personal data for purposes of targeted advertising, the sale of personal data, or profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer; and
  • Right to appeal a controller’s denial of a request to exercise one of the rights above.

Under the Act, consumers can designate an authorized agent to submit opt out requests on their behalf, but not requests to correct, delete, or access information about, or obtain a copy of, their personal data processed by the controller.

Consumers are entitled to at least one free request per year, after which the controller can charge a “reasonable fee” to cover the administrative cost of responding to requests that are “manifestly unfounded, excessive, or repetitive.” Controllers are not required to respond to requests that they cannot authenticate, except for opt out requests, which do not have to be authenticated.

Key Obligations Under the Act

While most of the obligations apply to controllers, the Act also imposes some direct obligations on processors, including the requirement to assist the controller in meeting its obligations under the Act and to only process personal data in accordance with the controller’s instructions. A processor that processes personal data beyond the controller’s processing instructions will be deemed a controller under the Act (and subject to all of the controller obligations).

The key requirements under the Act include:

  • Privacy Notice: The Act requires controllers to provide consumers with a reasonably accessible, clear, and meaningful privacy notice that includes (1) the categories of personal data the controller processes; (2) the purpose for processing; (3) the categories of third parties to which the controller may disclose personal data; (4) the categories of personal data the controller shares with third parties; (5) how a consumer can exercise their privacy rights; (6) the process by which the controller will notify consumers of material changes to the privacy policy; and (7) an active email address or other online mechanism the consumer can use to contact the controller. 

In addition, controllers that sell personal data or process personal data for purposes of targeted advertising, sales, or automated profiling “in furtherance of decisions that produce legal or similarly significant effects concerning a consumer” must “clearly and conspicuously disclose” such sales and processing and inform consumers of the manner in which they may opt out.

  • Data Protection Assessments: Like the majority of existing state comprehensive privacy laws, the Act will require controllers to conduct and document a data protection assessment prior to processing personal data that presents a “heightened risk of harm” to consumers. The definition of heightened risk of harm includes, for example, processing personal data for targeted advertising purposes, selling personal data, processing sensitive data, and processing personal data for the purposes of profiling that presents a reasonably foreseeable risk of certain types of harm (e.g., unlawful disparate impact on consumers, or financial or physical injury).  Processors are required to provide information to the controller as necessary to enable the controller to conduct and document data protection assessments.
  • Consumer Privacy Requests: Under the Act, controllers have 45 days to respond to consumer rights requests, which may be extended for an additional 45 days where “reasonably necessary.”  Processors are required to implement appropriate technical and organizational measures to enable the controller to meet its obligations to respond to consumer privacy requests.
  • Consumer Consent: Under the Act, controllers must obtain consumer consent to process: (1) sensitive data; (2) personal data for purposes that are not reasonably necessary to or compatible with the purposes of collection and processing, as initially disclosed to the consumer; and (3) personal data of individuals between 13 and 17 years old for the purpose of selling the data, serving targeted advertising, or profiling the individual.  Controllers must also provide consumers a mechanism for revoking consent that is as easy as the mechanism for providing consent.
  • Universal Opt-Out Mechanism: Six months from the effective date, the Act requires controllers engaged in targeted advertising or the “sale” of personal data to allow consumers to exercise the right to opt out of such processing through a user-selected universal opt-out mechanism. Further details will be provided in the forthcoming rules and regulations.
  • Collection Limitation: Controllers must limit the collection of personal data to what is adequate, relevant, and reasonably necessary for the purposes disclosed to the consumer and may not process personal data for incompatible purposes without first obtaining consent.
  • Security and Confidentiality: The Act imposes security obligations on both controllers and processors. Controllers are required to establish and maintain administrative, technical, and physical data security measures “appropriate to the volume and nature of the personal data,” including measures to protect the confidentiality, integrity and accessibility of personal data and secure it from unauthorized acquisition “during both storage and use.”  Processors are required to ensure that persons that process personal data are subject to confidentiality obligations and to help controllers meet their obligations to provide data breach notices and maintain reasonable security.

In addition, the Act imposes a joint obligation on both controllers and processors to implement “technical and organizational security measures to ensure a level of security that is appropriate to the risk” and to establish a clear allocation of the responsibilities between them to implement the measures.

  • Processor and Subcontractor Contracts: Controllers and processors are required to enter into a written contract that sets forth the processing instructions, identifies the type of personal data and duration of processing, requires the return or deletion of personal data at the end of the engagement, imposes obligations on the processor to demonstrate compliance to the controller and allow for and contribute to reasonable assessments by the controller, and includes other required terms.  Processors are also required to enter into written contracts with subcontractors binding them to comply with the obligations applicable to the processor.
  • Discrimination: Controllers are prohibited from discriminating against consumers for exercising their rights under the Act and from increasing the cost for, or decreasing the availability of, a product or service based “solely on the exercise of a right and unrelated to feasibility or the value” of the service.

Enforcement

The Act will be enforced solely by the New Jersey Attorney General who may seek penalties of up to $10,000 for the first violation and up to $20,000 for the second and subsequent violations. There is no private right of action available under the Act.

For the first 18 months following the effective date of the Act (January 15, 2025), there will be a 30-day cure period for violations. During this time, the Division of Consumer Affairs must issue a notice of a violation to the controller, “if a cure is deemed possible,” prior to bringing an enforcement action. If the violation is not cured within 30 days, the Division of Consumer Affairs can then bring an enforcement action. The right to cure only applies to violations by controllers, not processors.


[1] While an earlier version of the bill included a definition for “business” that excluded non-profit entities, this definition and exclusion were struck and are not included in the final version.
