Privacy Matters | DLA Piper's Global Privacy and Data Protection Resource

FTC Reiterates that Hashed and Pseudonymized Data is Still Identifiable Data https://privacymatters.dlapiper.com/2024/07/ftc-reiterates-that-hashed-and-pseudonymized-data-is-still-identifiable-data/ Fri, 26 Jul 2024 19:16:17 +0000

In a July 24, 2024 post to its Technology Blog, the Federal Trade Commission (FTC) reiterated its long-held view that hashing or pseudonymizing identifiers does not render data anonymous.

In the rather strongly worded post, the FTC acknowledges that hashing and pseudonymizing data have the benefit of obscuring the underlying personal data, but adamantly disagrees that these techniques render personal data anonymous, stating that:

[C]ompanies often claim that hashing allows them to preserve user privacy. This logic is as old as it is flawed – hashes aren’t “anonymous” and can still be used to identify users, and their misuse can lead to harm. Companies should not act or claim as if hashing personal information renders it anonymized.

The FTC emphasized that this has long been the agency's position, highlighting several prior enforcement actions on this point and also citing a 2012 FTC Technology Blog post, "Does Hashing Make Data 'Anonymous'?" (Rather than linking to the 2012 blog post, the FTC cheekily wrote: "To save a click, the answer is no, it does not.")

Unsurprisingly, the FTC seems focused on the use and disclosure of persistent online identifiers that are commonly used to recognize individuals and devices online, such as email addresses, phone numbers, MAC addresses, hashed email addresses, device identifiers, and advertising identifiers. In the post, the FTC stresses that hashing these identifiers does not relieve a company of its privacy obligations:

Regardless of what they look like, all user identifiers have the powerful capability to identify and track people over time, therefore the opacity of an identifier cannot be an excuse for improper use or disclosure.
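To see why the FTC takes this position, consider how a hashed email address actually behaves. A cryptographic hash is deterministic: the same input always produces the same digest, so a hashed email still functions as a stable, persistent identifier that can link records across datasets, and anyone holding a list of known addresses can match it back to the original. The sketch below is illustrative only (the function names and normalization step are our own, not drawn from the FTC post):

```python
import hashlib
from typing import Optional


def hash_email(email: str) -> str:
    """Hash an email the way ad-tech 'anonymization' commonly works:
    normalize the address, then apply SHA-256. The output is deterministic,
    so the same person always yields the same digest."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()


def reidentify(target_hash: str, candidate_emails: list) -> Optional[str]:
    """Because hashing is deterministic, a party holding a list of known
    emails can re-identify a 'hashed' record simply by hashing each
    candidate and comparing digests."""
    for email in candidate_emails:
        if hash_email(email) == target_hash:
            return email
    return None
```

The same determinism is what makes hashed-email list matching and custom-audience building work in the first place, which is precisely why the FTC treats the hashed value as identifiable data rather than anonymous data.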

The FTC also made clear its position that it is deceptive for a company to claim or treat as anonymous hashed or pseudonymized identifiers that enable the tracking or targeting of an individual or device over time and indicated that this is an area of focus for enforcement:

FTC staff will remain vigilant to ensure companies are following the law and take action when the privacy claims they make are deceptive.

Takeaways?

While this is not a new position or development, the FTC is indicating that it is an area of focus now. It may be a good time to remind digital, advertising, and other teams that online and other persistent identifiers—hashed or otherwise—are still personal data and subject to privacy requirements. It may also make sense to review relevant practices and areas, such as online and in-app identifiers and tracking (analytics, advertising or otherwise) and targeted advertising, including retargeting and custom audience building and list matching.

In addition, businesses may want to review privacy policies and other public-facing privacy statements to make sure they do not claim or imply that hashed or pseudonymized data is anonymous or overstate the privacy benefits of these practices. 

More Information

For more information about these developments and FTC enforcement in general, contact your DLA relationship Partner, the authors of this post, or any member of our Data, Privacy, and Cybersecurity team.

US: Kentucky Legislature Passes Comprehensive State Privacy Law https://privacymatters.dlapiper.com/2024/04/us-kentucky-legislature-passes-comprehensive-state-privacy-law/ Mon, 29 Apr 2024 18:55:26 +0000

On April 4, 2024, Kentucky Governor Andy Beshear signed House Bill 15, an act related to Kentucky consumer data privacy ("KCDPA"). Kentucky now joins the expanding list of states with comprehensive state privacy legislation, with the KCDPA set to take effect January 1, 2026.

Scope

The KCDPA applies to entities conducting business in Kentucky, or producing products or services targeted to Kentucky residents, and that during a calendar year meet one of the following criteria:

  • (1) control or process personal data of at least 100,000 Kentucky consumers; or
  • (2) control or process personal data of at least 25,000 Kentucky consumers and derive over 50% of gross revenue from the “sale” of personal data.

The KCDPA includes various entity-level exemptions commonly seen in other state privacy laws, which include, but are not limited to:

  • Any city, state agency, or political subdivision of the state;
  • Financial institutions subject to the Gramm-Leach-Bliley Act;
  • Covered entities or business associates governed under the Health Insurance Portability and Accountability Act (“HIPAA”);
  • Nonprofit organizations; and
  • Institutions of higher education.

Like most other state privacy laws, the bill contains data-level exemptions, which include but are not limited to, data processed in accordance with: HIPAA, the Fair Credit Reporting Act, the Driver’s Privacy Protection Act, the Family Educational Rights and Privacy Act, the Farm Credit Act, and the Children’s Online Privacy Protection Act (“COPPA”).

Key Definitions

The definitions under the KCDPA are generally consistent with those of existing comprehensive state privacy laws, with some of the key definitions mentioned below.

Consumer. A “consumer” means a natural person who is a resident of Kentucky acting only in an individual context. A consumer does not include a natural person acting in an employment context.

Personal Data. “Personal data” means any information that is linked or reasonably linkable to an identified or identifiable natural person. Personal data does not include de-identified data or publicly available information.

Profiling. “Profiling” means any form of automated processing performed on personal data to evaluate, analyze, or predict personal aspects related to an identified or identifiable natural person’s economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.

Sale. Under the KCDPA, "sale of personal data" is limited to the exchange of personal data for monetary consideration by the controller to a third party; unlike many other state privacy laws, it does not extend to exchanges for other valuable consideration.

Sensitive Data. “Sensitive data” means a category of personal data that includes (1) personal data indicating racial or ethnic origin, religious beliefs, mental or physical health diagnosis, sexual orientation, or citizenship or immigration status; (2) the processing of genetic or biometric data that is processed for the purpose of uniquely identifying a specific natural person; (3) the personal data collected from a known child; or (4) precise geolocation data.

Targeted Advertising. The term "targeted advertising" refers to displaying advertisements to a consumer where the advertisement is selected based on personal data obtained or inferred from that consumer's activities over time and across nonaffiliated websites or online applications to predict that consumer's preferences or interests.

Consumer Rights

Consistent with various other state privacy laws currently in effect, the KCDPA provides consumers with the following rights:

  • The right to confirm whether a controller is processing the consumer’s personal data and to access the personal data;
  • The right to correct inaccuracies in the consumer’s personal data;
  • The right to delete personal data provided by or obtained about the consumer;
  • The right to data portability; and
  • The right to opt-out of the processing of personal data for purposes of targeted advertising, the sale of personal data, or profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer.

Consumers also have the right to appeal a controller’s refusal to take action on the consumer’s request. Further, controllers are prohibited from discriminating against a consumer for exercising their rights.

Key Obligations

While most obligations apply to controllers, the KCDPA imposes certain direct obligations on processors, including adhering to the instructions of the controller and assisting the controller in meeting its obligations under the KCDPA.

Consistent with other comprehensive state privacy laws, the KCDPA imposes various key obligations on controllers, as discussed below.

Privacy Notice. Controllers must provide consumers with a reasonably accessible, clear, and meaningful privacy notice that includes:

  •  The categories of personal data processed by the controller;
  • The purpose for processing personal data;
  • How consumers may exercise their consumer rights, including how a consumer may appeal a controller’s decision with regard to the consumer’s request;
  • The categories of personal data that the controller shares with third parties, if any; and
  • The categories of third parties, if any, with whom the controller shares personal data.

In addition, controllers that “sell” personal data to third parties or process personal data for targeted advertising are required to conspicuously disclose such activity in the privacy policy, as well as the manner in which a consumer may exercise the right to opt out.

The privacy policy must also include one (1) or more secure and reliable means for consumers to submit a request to exercise their consumer rights.

Consumer Privacy Requests. Under the KCDPA, controllers have 45 days to respond to a consumer’s privacy request, which may be extended an additional 45 days when “reasonably necessary,” provided that the controller informs the consumer of any extension within the initial 45-day response period, together with the reason for the extension.

Data Protection Assessment. The KCDPA requires controllers to conduct and document a data protection impact assessment in the following circumstances:

  • If processing personal data for targeted advertising;
  • If processing personal data for purposes of the sale of personal data;
  • If processing personal data for purposes of profiling where the profiling presents a reasonably foreseeable risk to the consumer (i.e., unfair or deceptive treatment, financial, physical or reputational injury, etc.);
  • If processing sensitive data; or
  • If processing personal data presents a heightened risk of harm.

Notably, data protection assessment requirements apply only to processing activities created or generated on or after June 1, 2026.

Consumer Consent. Under the KCDPA, controllers must obtain a consumer’s consent before processing sensitive data, and before processing personal data for purposes that are neither reasonably necessary for, nor compatible with, the purposes disclosed to the consumer.

Collection Limitation. Controllers must limit the collection of personal data to what is adequate, relevant, and reasonably necessary in relation to the purposes for which the data is processed as disclosed to the consumer.

Security and Confidentiality. The KCDPA requires controllers to implement and maintain reasonable administrative, technical, and physical data security practices to protect the confidentiality, integrity, and accessibility of personal data.

Universal Opt-Out Mechanism. Unlike many of the recently enacted state privacy laws, the KCDPA does not require controllers to recognize opt-out preference signals as a means of processing opt-out requests.

Enforcement

The Attorney General has exclusive authority to enforce violations of the KCDPA, which includes initiating an action to seek damages of up to $7,500 for each violation. The Attorney General may also recover reasonable expenses incurred in investigating and preparing the case, court costs, attorney’s fees, and any other relief ordered by the court in any action initiated under the KCDPA.

Importantly, the KCDPA contains a right to cure provision, which does not sunset. Prior to initiating an action, the Attorney General is required to provide a controller or processor 30 days’ written notice identifying the specific provisions of the KCDPA the Attorney General alleges have been or are being violated. If the violation is not cured within the 30-day period, the Attorney General may then initiate an enforcement action.

Notably, there is no private right of action under the KCDPA.

For more information about these developments, contact the authors of this blog post, your DLA relationship Partner, or any member of DLA’s Data, Privacy and Cybersecurity team.

US: The FTC Cracks Down on Sensitive Personal Information Disclosures https://privacymatters.dlapiper.com/2024/04/us-the-ftc-cracks-down-on-sensitive-personal-information-disclosures/ Sat, 27 Apr 2024 00:41:47 +0000

The Federal Trade Commission (“FTC”) is taking bold actions to challenge businesses’ collection and monetization of consumers’ personal data—particularly sensitive personal data. This month, the FTC reached settlements with a data broker, X-Mode Social and its successor Outlogic LLC (“X-Mode”), and an alcohol addiction treatment firm, Monument Inc. (“Monument”), for, among other things, allegedly selling and/or sharing sensitive personal data to or with third-party advertising firms, without consent and contrary to each company’s public disclosures. These settlements are just two of several notable sensitive data-related enforcement actions by the FTC recently.

In this post, we summarize and provide key takeaways from the FTC’s enforcement against X-Mode and Monument.

I. The FTC’s Order Against X-Mode for Selling and Sharing Sensitive Location Information

The FTC reached an unprecedented settlement with data broker, X-Mode, prohibiting it from disclosing sensitive geolocation information and requiring it to delete or destroy all precise geolocation data previously collected as well as all products or services created with this data, unless it obtains valid consumer consent.

Background

In its complaint, the FTC alleges X-Mode sold precise geolocation data that could be used to track individuals’ visits to sensitive locations such as reproductive health clinics, shelters, medical clinics, or places of worship, in violation of Section 5 of the FTC Act, which prohibits companies from engaging in unfair or deceptive trade practices. The FTC alleges X-Mode surreptitiously collected and sold precise geolocation data from millions of users without their consent, in violation of their privacy rights, and in direct opposition to the company’s own public representations.

In particular, the FTC alleges that X-Mode did not adequately disclose the intended use of users’ geolocation data and did not secure valid, informed, and affirmative consent from users prior to the data collection and/or sharing. Further, the company did not provide users of its own apps (e.g., Drunk Mode and Walk Against Humanity) with transparent notices describing the purposes for collecting and processing geolocation information or notifying them that their information would be sold to government contractors for national security purposes. Additionally, X-Mode allegedly failed to honor Android users’ requests to opt out of such data collection and provided third parties access to these users’ sensitive personal data in conflict with their privacy choices.

Despite having two of its own apps that collect geolocation information, X-Mode primarily relies on third-party app publishers to amass the location information it collects and sells. The FTC claims the company provided sample consumer notices to these third-party app publishers that misled consumers about the purposes for which their location information was being collected, used, and could otherwise be processed. The company also allegedly failed to verify that the third-party app publishers were, on their own, notifying their consumers of the relevant processing purposes and obtaining valid consent.

Additionally, the FTC alleges the company targeted consumers based on sensitive characteristics and failed to remove sensitive geolocation information from the raw location data it sold to third parties downstream. It also failed to implement reasonable or appropriate safeguards to protect against harmful downstream uses of the location information it sold.

FTC Order Requirements

The FTC’s decision and order prohibits X-Mode from selling or sharing any sensitive location data and requires the company to:

  • delete or destroy all precise geolocation data previously collected as well as all products or services created with this data, unless it obtains valid consumer consent or ensures the data has been de-identified or rendered non-sensitive.
  • maintain a comprehensive record of all sensitive location data it collects and maintains, to ensure it is adequately protecting and not unlawfully selling or sharing this information.
  • develop a supplier assessment program to ensure that third parties who provide location data to X-Mode:
    • obtain affirmative express consent from consumers for the collection, use, and sale of their data and
    • ensure that data brokers/providers are tracking and honoring individuals’ requests to opt out of the sale/disclosure of their data.
  • ensure all recipients of its location data do not associate it with sensitive locations, such as medical facilities, religious institutions, shelters, schools, union offices, and immigrant service offices.
  • notify the FTC within thirty (30) days of determining there was a “third-party incident,” defined as a third-party sharing X-Mode’s location data in violation of its contractual limitations.
  • establish a data retention schedule and implement a comprehensive privacy program that adequately protects consumers’ personal information.

The order specifies that disclosures requesting consumers’ “affirmative express consent” must be “clear and conspicuous” and separate from any existing terms of service, terms of use, or privacy policy. Hovering over a piece of content on a website, muting content, pausing content, or closing content will not constitute affirmative express consent.

Likewise, the FTC’s order against Monument for certain alleged disclosures of sensitive health data stipulates similar remedial measures.

II. The FTC’s Order Against Monument regarding Disclosures of Sensitive Health Data to Third Parties for Marketing Purposes

The FTC announced a proposed order, prohibiting alcohol addiction treatment company, Monument, from disclosing individuals’ health information to third-party advertising companies and platforms for purposes of targeted advertising without valid consent.

Background

In its complaint, the FTC alleges that Monument used online tracking technologies such as cookies, pixels, APIs, and other similar technologies, to collect personal data about individuals who visited and interacted with Monument’s websites and other online and subscription services. The relevant data includes name, email address, address, phone number, date of birth, IP address, government-issued ID, information about alcohol consumption and medical history, device identifiers, and other relevant information about the 84,000 impacted individuals. Once collected, Monument allegedly categorized this information into ‘Custom Events’ and provided the Custom Event information along with email addresses, IP addresses, and other unique identifiers to the third-party advertisers for re-targeting and custom audience purposes, allowing advertisers to identify specific individuals for targeted advertising. The complaint further alleges that Monument’s contracts with these third-party advertisers did not limit the third parties’ downstream use of the disclosed personal data for their own commercial purposes.

The FTC documented that Monument publicly claimed in its privacy policy that it was fully compliant with the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”) and that any information provided by individuals would be kept “100% confidential, secure and HIPAA compliant.” In addition, the policy stated that Monument would not disclose any personal data, including health information, to third parties without the individual’s written consent. Nonetheless, the privacy policy simultaneously stated that Monument would disclose personal data, including health information, to third parties including for marketing purposes.

The FTC claims that these disclosures and representations violate Section 5 of the FTC Act, as misrepresentations and deceptive omissions of material facts constituting deceptive practices, and the Opioid Addiction Recovery Fraud Prevention Act of 2018 (“OARFPA”), as unfair or deceptive acts or practices related to information regarding a substance use disorder treatment service and/or product.

FTC Order Requirements

Under the order, along with imposing a (suspended) $2.5 million civil penalty, Monument must, among other things:

  • identify all health information the company shared with relevant third parties for unlawful purposes and instruct the third-party recipients to delete such data;
  • provide notice to all impacted individuals about the unlawful disclosure of their personal data, including their health information;
  • not disclose any health information to third parties for advertising purposes;
  • obtain an individual’s affirmative express consent prior to disclosing health information for any purpose other than advertising (which is prohibited under the order); and
  • not make deceptive or misleading statements to promote its services, such as about its HIPAA compliance and its data practices.

Monument is also ordered to implement a comprehensive privacy program to protect the privacy and security of the personal data it collects, retains, and discloses. The privacy program must include:

  • a privacy officer who is a designated and qualified employee that reports to a senior executive and who is responsible for the privacy program;
  • regular assessments of the company’s privacy operations concerning personal data;
  • adequate technical, administrative, and organizational safeguards to protect personal data, including reviews of its relevant contracts with third parties;
  • a data retention policy that limits retention of personal data to the shortest time necessary to fulfill the purposes for which it was collected and the retention schedule must be made publicly available; and
  • processes to maintain records of processing activities that capture the personal data that is collected on behalf of and/or disclosed to a third party.

III. Takeaways

In line with its other recent enforcement actions, these orders underscore the FTC’s commitment to restraining the collection, sale, or disclosure of consumers’ sensitive personal information. Businesses that collect, sell, or otherwise process sensitive personal information, and particularly precise geolocation information and health information, should:

  • Establish and implement a comprehensive privacy program that adequately maps the company’s collection and processing of personal information and protects consumers’ personal information;
  • Conduct due diligence of downstream third-party businesses and service providers to whom it discloses personal information and ensure that adequate contractual terms are in place;
  • Obtain affirmative and informed prior consent from individuals for the collection, use, disclosure and/or sale of their sensitive personal data;
  • Avoid sharing, selling, or otherwise disclosing sensitive geolocation data and health information;
  • Ensure data providers/data brokers who supply the company with personal information are collecting informed, affirmative and valid consent from individuals and honoring opt-outs as necessary; and
  • Review their data retention schedules and practices.

These orders highlight the growing importance of implementing and maintaining a comprehensive, well-rounded privacy program that goes beyond providing a cookie-cutter privacy policy, and the FTC’s willingness to increase oversight and institute significant consequences against those who don’t.

For more information about these developments and FTC enforcement in general, contact your DLA relationship Partner, the authors of this post, or any member of our Data, Privacy and Cybersecurity team.

US: New Hampshire Enacts 15th Comprehensive State Privacy Law https://privacymatters.dlapiper.com/2024/04/us-new-hampshire-enacts-15th-comprehensive-state-privacy-law/ Fri, 12 Apr 2024 21:46:47 +0000

On March 6, 2024, the New Hampshire Governor signed into law Senate Bill 255 (the “NH Act”), making New Hampshire the 15th state to adopt a comprehensive state privacy law. The NH Act will take effect January 1, 2025. This post explores how the NH Act stacks up against the other comprehensive state privacy laws.

Applicability

The NH Act applies to covered businesses that either conduct business in New Hampshire or produce products or services targeted toward New Hampshire residents, and meet either of the following thresholds during a one-year period:

  • control or process the personal data of not less than 35,000 unique consumers, excluding personal data controlled or processed solely for the purpose of completing a payment transaction; or
  • control or process the personal data of not less than 10,000 unique consumers and derive more than 25 percent of their gross revenue from the sale of personal data.

These thresholds are considerably lower than those in most other states’ privacy laws. Businesses that do not trigger compliance with other state privacy laws, including those currently in effect (such as California, Colorado, Connecticut, Virginia, and Utah), should review their practices and determine whether these lower thresholds trigger compliance in New Hampshire.

Like many other state privacy laws, the NH Act contains various exemptions, such as those for nonprofits, institutions of higher education, financial institutions, or data subject to the Gramm-Leach-Bliley Act. Additionally, the NH Act provides several Health Insurance Portability and Accountability Act of 1996 (“HIPAA”) exemptions, including those for “covered entities,” “business associates,” and “protected health information” (as these terms are defined under HIPAA).

Key Definitions

The NH Act’s definitions largely align with definitions from other state privacy laws. For instance:

Consent: Like most other state privacy laws, “consent” means “a clear affirmative act signifying a consumer’s freely given, specific, informed and unambiguous agreement to allow the processing of personal data relating to the consumer.” This does not include “acceptance of a general or broad terms of use,” similar methods that bury language regarding processing personal data, or “the use of deceptive design patterns.”

Consumer: Under the NH Act, a “consumer” is “an individual who is a resident of [New Hampshire].” Similar to many other state privacy laws, “consumer” does not include an “individual acting in a commercial or employment context.”

De-identified Data: Means “data that cannot reasonably be used to infer information about, or otherwise be linked to, an identified or identifiable individual, or a device linked to such individual.”

Personal Data: Means “any information that is linked or reasonably linkable to an identified or identifiable individual” but “does not include de-identified data or publicly available information.”

Profiling: Means “any form of automated processing performed on personal data to evaluate, analyze, or predict personal aspects related to an identified or identifiable individual’s economic situation, health, personal preferences, interests, reliability, behavior, location or movements.”

Sale of Personal Data: Means “the exchange of personal data for monetary or other valuable consideration by the controller to a third party.” This does not include disclosures to processors or to third parties for purposes of providing a product or service that the consumer requested. The NH Act also limits this definition by carving out disclosures when the consumer requests that the disclosure occurs or when the consumer intentionally makes the information available to the general public “via a channel of mass media.” Additionally, a “sale of personal data” does not occur when the controller discloses or transfers the information to an affiliate.

Sensitive Data: Means “personal data that includes data revealing racial or ethnic origin, religious beliefs, mental or physical health condition or diagnosis, sex life, sexual orientation or citizenship or immigration status; the processing of genetic or biometric data for the purpose of uniquely identifying an individual; personal data collected from a known child; or, precise geolocation data.”

Targeted Advertising: Means advertising to a consumer “based on personal data obtained or inferred from that consumer’s activities over time and across nonaffiliated Internet web sites or online applications to predict such consumer’s preferences or interests.”

Key Obligations

The NH Act imposes obligations on both controllers and processors, and like most comprehensive privacy laws, the majority of the responsibilities fall on controllers. Similar to other state comprehensive privacy laws, processors must adhere to the controller’s instructions, assist the controller in meeting its obligations, and enter into a data processing agreement with the controller.

Key requirements under the NH Act include:

  • Privacy Notice: Under the NH Act, controllers must provide consumers with a reasonably accessible, clear, and meaningful privacy notice that includes: (1) the categories of personal data processed by the controller; (2) the purpose for processing personal data; (3) how consumers may exercise their consumer rights, including how a consumer may appeal a controller’s decision; (4) the categories of personal data that the controller shares with third parties, if any; (5) the categories of third parties, if any, with which the controller shares personal data; and (6) an active electronic mail address or other online mechanism that the consumer may use to contact the controller. Importantly, the notice must meet the standards that the NH Act delegates to the New Hampshire Secretary of State to develop. These standards are forthcoming.
  • Data Minimization & Purpose Limitation: Like most other comprehensive state privacy laws, the NH Act requires controllers to limit the collection of personal data to what is adequate, relevant, and reasonably necessary for the purposes disclosed to the consumer and not process the data for incompatible purposes unless the controller first obtains the consumer’s consent.
  • Security: The NH Act requires controllers to establish, implement, and maintain reasonable administrative, technical and physical data security practices to protect the confidentiality, integrity and accessibility of personal data “appropriate to the volume and nature of the personal data at issue.” Processors must ensure that persons that process personal data are subject to a confidentiality duty for that data and assist controllers in meeting their obligations to provide data breach notices and maintain reasonable security.
  • Opt-Out Preference Signal: By January 1, 2025, the NH Act requires controllers to allow consumers to opt-out of any processing of the consumer’s personal data for the purposes of targeted advertising, or any sale of such personal data, through an opt-out preference signal.
  • Data Protection Assessments: The NH Act also requires controllers to conduct and document data protection assessments for each processing activity that “presents a heightened risk of harm to the consumer.” This includes: (1) the processing of personal data for the purposes of targeted advertising; (2) the sale of personal data; (3) the processing of sensitive data; and (4) profiling, when such profiling presents a reasonably foreseeable risk of:
    • Unfair or deceptive treatment of consumers;
    • Unlawful disparate impact on consumers;
    • Financial, physical or reputational injury to consumers;
    • A physical or other intrusion upon the solitude or seclusion, or the private affairs or concerns, of consumers, where such intrusion would be offensive to a reasonable person; and
    • Other substantial injury to consumers.
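The NH Act does not name a particular opt-out signal (the forthcoming standards may), but the Global Privacy Control (GPC), which browsers transmit as the `Sec-GPC: 1` request header, is the most widely recognized example in practice. A minimal, framework-agnostic sketch of honoring such a signal server-side — the header name follows the GPC specification; everything else here is illustrative:

```python
def opt_out_signal_present(headers: dict) -> bool:
    """Return True if the request carries a universal opt-out signal.

    Checks for the Global Privacy Control header (Sec-GPC: 1) as one
    example of a user-selected opt-out preference signal; rules under
    the NH Act could recognize other signals as well.
    """
    # HTTP header names are case-insensitive, so normalize before lookup.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"


# Example: suppress targeted-advertising processing when the signal is set.
request_headers = {"Sec-GPC": "1", "User-Agent": "ExampleBrowser/1.0"}
targeted_ads_allowed = not opt_out_signal_present(request_headers)
```

A controller would apply this check before any targeted-advertising or sale-related processing of the request's associated personal data.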

Consumer Rights

In line with other state privacy laws in effect, the NH Act provides consumers with the following rights:

  • Right to access personal data
  • Right to correct inaccuracies in personal data
  • Right to delete personal data
  • Right to obtain a copy of personal data
  • Right to opt-out of the processing of the personal data for purposes of targeted advertising
  • Right to opt-out of the sale of personal data (as defined above)
  • Right to opt-out of profiling in furtherance of solely automated decisions that produce legal or similarly significant effects concerning the consumer
  • Right to appeal a controller’s denial of a request to exercise one of the rights above

A consumer may also designate an authorized agent to submit opt out requests on the consumer’s behalf, but not requests to correct, delete, or access information about, or obtain a copy of, their personal data processed by the controller. Additionally, consumers are entitled to at least one free request per year, after which a controller may charge a “reasonable fee” to cover administrative costs associated with handling the request.

Similar to many other states, the NH Act requires controllers to respond to a rights request within 45 days absent an additional 45-day extension when “reasonably necessary.” The controller must inform the consumer about the extension within the initial 45-day period and provide a rationale for the extension.
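The 45-day clock and its optional 45-day extension reduce to simple calendar arithmetic; a sketch, assuming calendar days from receipt (a detail the statute does not elaborate on):

```python
from datetime import date, timedelta

RESPONSE_WINDOW_DAYS = 45  # initial response period under the NH Act
EXTENSION_DAYS = 45        # available when "reasonably necessary"


def response_deadline(received: date, extended: bool = False) -> date:
    """Deadline for responding to a consumer rights request.

    Note: the extension must be invoked, with a rationale to the
    consumer, within the initial 45-day window.
    """
    days = RESPONSE_WINDOW_DAYS + (EXTENSION_DAYS if extended else 0)
    return received + timedelta(days=days)


# A request received March 1, 2025 is due April 15, 2025,
# or May 30, 2025 if the extension is invoked.
```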

Enforcement

The New Hampshire Attorney General (the “Attorney General”) has the exclusive authority to enforce the NH Act. The NH Act does not specify any statutory penalties. Like most other state privacy laws, the NH Act does not provide for a private right of action.

The NH Act also provides covered businesses a 60-day cure period to address alleged violations until December 31, 2025. Beginning January 1, 2026, the Attorney General may provide controllers the opportunity to cure after considering the following factors: (1) the number of violations; (2) the size and complexity of the controller or processor; (3) the nature and extent of the controller’s or processor’s processing activities; (4) the substantial likelihood of injury to the public; (5) the safety of persons or property; and (6) whether such alleged violation was likely caused by human or technical error.

In addition to the NH Act, several other newly adopted privacy laws are set to take effect in 2024, 2025, and beyond. For more information about these developments, please contact your DLA Piper relationship partner, the authors of this alert, or any member of our Data Protection, Privacy and Cybersecurity Practice.

US: New Jersey Enacts Comprehensive State Privacy Law

(Originally published February 13, 2024: https://privacymatters.dlapiper.com/2024/02/us-new-jersey-enacts-comprehensive-state-privacy-law/)

On January 16, 2024, the New Jersey Governor signed into law Senate Bill 332 (the “Act”), making New Jersey the 14th state to adopt a comprehensive state privacy law. The Act will take effect on January 15, 2025, and requires the Division of Consumer Affairs to issue rules and regulations to effectuate the Act; however, the Act does not specify a timeline for establishing such regulations.

Regulated Entities

The Act applies to entities that conduct business in New Jersey or produce products or services that are targeted to New Jersey residents, and that during a calendar year meet one of the following criteria:

  • control or process the personal data of at least 100,000 New Jersey consumers; or
  • control or process the personal data of at least 25,000 New Jersey consumers and derive revenue, or receive a discount on the price of any goods or services, from the “sale” of personal data.

Unlike many other comprehensive state privacy laws, the Act does not contain an exemption for nonprofits.[1] It does, however, exempt “financial institutions” that are subject to the Gramm-Leach-Bliley Act.  On the other hand, the Act (similar to the CCPA) only exempts “protected health information collected by a covered entity or business associate” subject to HIPAA but does not exempt covered entities (or business associates) in their entirety.  Like most state comprehensive privacy laws, the Act also contains some limited exemptions for personal data subject to certain federal privacy laws and regulations, including (1) personal data sold pursuant to the Drivers’ Privacy Protection Act of 1994, (2) personal data collected, processed, sold, or disclosed by a consumer reporting agency in compliance with the Fair Credit Reporting Act, and (3) personal data collected, processed, or disclosed as part of clinical research conducted in accordance with U.S. federal policy (45 C.F.R. Part 46) or FDA regulations (21 C.F.R. Parts 50 and 56) for the protection of human subjects in clinical research.

Key Definitions

For the most part, the definitions under the Act align to those of existing state comprehensive privacy laws.

Consumer: A “consumer” is “an identified person who is a resident of [New Jersey] acting only in an individual or household context.” As with the majority of the other state comprehensive privacy laws (not including the California Consumer Privacy Act or “CCPA”), the Act expressly excludes “a person acting in a commercial or employment context.”

Personal Data: Under the Act, “personal data” includes “any information that is linked or reasonably linkable to an identified or identifiable person. . . not [including] de-identified data or publicly available data.”

Profiling: Under the Act, “profiling” means “automated processing” of personal data “to evaluate, analyze or predict. . . an identified or identifiable individual’s economic situation, health, personal preferences, interests, reliability, behavior, location or movements.” The Act imposes varying obligations and restrictions on certain (automated) profiling activities that could impact consumers in a legal or similarly significant way or that pose a heightened risk of certain types of harm or negative impacts on consumers.

Sale: In line with the CCPA and the majority of state comprehensive privacy laws, the Act broadly defines “sale” to include “sharing, disclosing or transferring of personal data for monetary or other valuable consideration.”  However, in addition to carving out transfers to processors and transfers to provide a service requested by a consumer, the Act also specifically carves out from “sale” transfers to affiliates and transfers of personal data that a “consumer intentionally made available to the general public through a mass media channel and did not restrict to a specific audience.”

Sensitive Data: Similar to most comprehensive state privacy laws, under the Act,  “sensitive data” includes personal data revealing racial or ethnic origin, religious belief, mental or physical health condition, treatment or diagnosis, sex life or sexual orientation, citizenship or immigration status, genetic or biometric data that may be processed for the purpose of uniquely identifying an individual, personal data collected from a known child, and precise geolocation data. More broadly than most other state privacy laws, “sensitive data” also includes “financial information which shall include a consumer’s account number, account log-in, financial account, or credit or debit card number, in combination with any required security code, access code, or password that would permit access to a consumer’s financial account” and “status as transgender or non-binary.” 

Targeted Advertising: The term “targeted advertising” means advertising to a consumer “based on personal data obtained or inferred from that consumer’s activities over time and across nonaffiliated Internet web sites or online applications.”

Consumer Rights

In line with other state privacy laws in effect, the Act provides consumers with the following rights:

  • Right to access personal data;
  • Right to correct personal data;
  • Right to delete personal data;
  • Right to obtain a copy of personal data;
  • Right to opt out of the processing of personal data for purposes of targeted advertising, the sale of personal data, or profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer; and
  • Right to appeal a controller’s denial of a request to exercise one of the rights above.

Under the Act, consumers can designate an authorized agent to submit opt out requests on their behalf, but not requests to correct, delete, or access information about, or obtain a copy of, their personal data processed by the controller.

Consumers are entitled to at least one free request per year, after which the controller can charge a “reasonable fee” to cover the administrative cost of responding to requests that are “manifestly unfounded, excessive, or repetitive.” Controllers are not required to respond to requests that they cannot authenticate, except for opt-out requests, which do not have to be authenticated.

Key Obligations Under the Act

While most of the obligations apply to controllers, the Act also imposes some direct obligations on processors, including the requirement to assist the controller in meeting its obligations under the Act and to only process personal data in accordance with the controller’s instructions. A processor that processes personal data beyond the controller’s processing instructions will be deemed a controller under the Act (and subject to all of the controller obligations).

The key requirements under the Act include:

  • Privacy Notice: The Act requires controllers to provide consumers with a reasonably accessible, clear, and meaningful privacy notice that includes (1) the categories of personal data the controller processes; (2) the purpose for processing; (3) the categories of third parties to which the controller may disclose personal data; (4) the categories of personal data the controller shares with third parties; (5) how a consumer can exercise their privacy rights; (6) the process by which the controller will notify consumers of material changes to the privacy policy; and (7) an active email address or other online mechanism the consumer can use to contact the controller. 

In addition, controllers that sell personal data or process personal data for purposes of targeted advertising, sales, or automated profiling “in furtherance of decisions that produce legal or similarly significant effects concerning a consumer,” must “clearly and conspicuously disclose” such sales and processing and inform consumers of the manner in which they may opt out.

  • Data Protection Assessments: Like the majority of existing state comprehensive privacy laws, the Act will require controllers to conduct and document a data protection assessment prior to processing personal data that presents a “heightened risk of harm” to consumers. The definition of heightened risk of harm includes, for example, processing personal data for targeted advertising purposes, selling personal data, processing sensitive data, and processing personal data for the purposes of profiling that presents a reasonably foreseeable risk of certain types of harm (e.g., unlawful disparate impact on consumers, or financial or physical injury). Processors are required to provide information to the controller as necessary to enable the controller to conduct and document data protection assessments.
  • Consumer Privacy Requests: Under the Act, controllers have 45 days to respond to consumer rights requests, which may be extended for an additional 45 days where “reasonably necessary.”  Processors are required to implement appropriate technical and organizational measures to enable the controller to meet its obligations to respond to consumer privacy requests.
  • Consumer Consent: Under the Act, controllers must obtain consumer consent to process: (1) sensitive data; (2) personal data for purposes that are not reasonably necessary to or compatible with the purposes of collection and processing, as initially disclosed to the consumer; and (3) personal data of individuals between 13 and 17 years old for the purpose of selling the data, serving targeted advertising, or profiling the individual.  Controllers must also provide consumers a mechanism for revoking consent that is as easy as the mechanism for providing consent.
  • Universal Opt-Out Mechanism: Six months from the effective date, the Act requires controllers engaged in targeted advertising or the “sale” of personal data to allow consumers to exercise the right to opt out of such processing through a user-selected universal opt-out mechanism. Further details will be provided in the forthcoming rules and regulations.
  • Collection Limitation: Controllers must limit the collection of personal data to what is adequate, relevant, and reasonably necessary for the purposes disclosed to the consumer and may not process personal data for incompatible purposes without first obtaining consent.
  • Security and Confidentiality: The Act imposes security obligations on both controllers and processors. Controllers are required to establish and maintain administrative, technical, and physical data security measures “appropriate to the volume and nature of the personal data,” including measures to protect the confidentiality, integrity and accessibility of personal data and secure it from unauthorized acquisition “during both storage and use.” Processors are required to ensure that persons that process personal data are subject to confidentiality obligations and to help controllers meet their obligations to provide data breach notices and maintain reasonable security.

In addition, the Act imposes a joint obligation on both controllers and processors to implement “technical and organizational security measures to ensure a level of security that is appropriate to the risk” and to establish a clear allocation of the responsibilities between them to implement those measures.

  • Processor and Subcontractor Contracts: Controllers and processors are required to enter into a written contract that sets forth the processing instructions, identifies the type of personal data and duration of processing, requires the return or deletion of personal data at the end of the engagement, imposes obligations on the processor to demonstrate compliance to the controller and allow for and contribute to reasonable assessments by the controller, and includes other required terms.  Processors are also required to enter into written contracts with subcontractors binding them to comply with the obligations applicable to the processor.
  • Discrimination: Controllers are prohibited from discriminating against consumers for exercising their rights under the Act or from increasing the cost for, or decreasing the availability of, a product or service based “solely on the exercise of a right and unrelated to feasibility or the value” of the service.

Enforcement

The Act will be enforced solely by the New Jersey Attorney General who may seek penalties of up to $10,000 for the first violation and up to $20,000 for the second and subsequent violations. There is no private right of action available under the Act.

For the first 18 months following the effective date of the Act (January 15th, 2025), there will be a 30-day cure period for violations.  During this time, the Division of Consumer Affairs must issue a notice of a violation to the controller “if a cure is deemed possible,” prior to bringing an enforcement action.  If the violation is not cured within 30 days, the Division of Consumer Affairs can then bring an enforcement action.   The right to cure only applies to violations by controllers—not processors. 


[1] While an earlier version of the bill included a definition for “business” that excluded non-profit entities, this definition and exclusion were struck and are not included in the final version.

US: Open Banking Regulation Arrives in the US

(Originally published January 17, 2024: https://privacymatters.dlapiper.com/2024/01/us-open-banking-regulation-arrives-in-the-us/)

In 2010, Congress included a provision in the Consumer Financial Protection Act (CFPA) requiring that the Consumer Financial Protection Bureau (CFPB or Bureau) promulgate rules effectuating what is commonly referred to as “Open Banking.” Specifically, the rules would require any entity that engages in offering or providing a consumer financial product or service to make available information concerning the financial products or services that the consumer received from the entity. However, it was not until October 2023 that the CFPB issued a proposed rule to implement the CFPA’s open banking/consumer financial data right.

In the intervening time, the European Union put into place the Payment Services Directive 2 (PSD2), which mandated that banks open their data to third parties with consumer consent.

This post provides a brief overview of the CFPB’s proposal (the public comment period for the proposed rule closed on December 29, 2023, and the final rule can be expected in the coming months) and compares the rule’s requirements to those in PSD2.

Who is covered

The CFPB’s proposed rule takes a phased approach to making consumer financial data available to both consumers and authorized third parties, applying to a limited set of data providers and covered financial products, services, and information. The proposed rule would apply to the following types of financial products and services: (1) demand deposit (checking), savings, or other consumer asset accounts subject to Regulation E, (2) credit cards subject to Regulation Z, and (3) payment facilitation  from  Regulation E accounts or Regulation Z credit cards (such as  digital wallet services). The proposed rule defines “data providers” as covered persons subject to the CFPA that are financial institutions as defined by Regulation E (e.g., banks, savings associations, credit unions), credit card issuers as defined by Regulation Z, or any other persons that control or possess information concerning a covered consumer financial product or service the consumer obtained from those persons. While the proposal does not cover other types of consumer financial products or services, such as mortgages, student loans, or other closed end lending products, the CFPB intends to conduct supplemental rule-making proceedings to expand the scope of the open banking rule.

For purposes of the proposed rule, a “consumer” means a natural person; the term also includes trusts that are established for tax or estate planning purposes (which notably are not included under current US federal financial privacy law). A “third party” is any person or entity that is not the consumer to whom the covered data pertains or the data provider that controls or possesses the consumer’s covered data. A “data aggregator” is an entity that is retained by and provides services to an authorized third party to enable access to covered data.

Requirements for data providers

Under the proposed rule, when a data provider receives a request from a consumer or an authorized third party for “covered data” in the data provider’s possession or control, the data provider must make the covered data available in an electronic machine-readable file that consumers and authorized third parties can retain.

“Covered data” includes:

  • Transaction information (including 24 months of historical transaction information):  information about individual transactions, including payment amount, date, payment type, pending or authorized status, payee or merchant name, rewards credits, fees, finance charges.
  • Account balance: Available funds in an asset account or credit card balance.
  • Information to initiate payment to or from a Reg E account: Actual or tokenized account and routing numbers used to initiate an ACH transaction.
  • Terms and conditions: Contractual terms under which data provider provides financial products or services to the consumer. Includes pricing (APR, APY, fees, other pricing), rewards program terms, and dispute resolution (i.e., arbitration) requirements.
  • Upcoming bill information: Payments scheduled through the data provider (e.g., recurring or scheduled online bill pay transactions), payments due to the data provider.
  • Basic account verification information: Name, address, email address, phone number associated with the covered financial product or service.

Data providers will need to establish a consumer interface to field consumer requests for covered data and a developer interface for covered data requests from authorized third parties and data aggregators. Notably, data providers will not be able to charge either consumers or authorized third parties fees to provide covered data or for developing or maintaining the respective interfaces. The rule would establish performance requirements for the developer interface (similar to a service level agreement in a contract); the developer interface must “properly respond” to 99.5% of the requests for covered data, and a data provider may not unreasonably restrict the frequency that it receives and responds to requests for covered data. Data providers’ developer interfaces also will need to comply with the Gramm Leach Bliley Act’s (GLBA) information security requirements.
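The 99.5% figure works like a contractual service level; a quick sketch of checking interface performance against it over a measurement period (the threshold is from the proposal; everything else is illustrative):

```python
SLA_THRESHOLD = 0.995  # proposed "properly responded" rate


def meets_sla(proper_responses: int, total_requests: int) -> bool:
    """True if the developer interface met the proposed 99.5% response rate."""
    if total_requests == 0:
        return True  # no requests received, nothing to fail
    return proper_responses / total_requests >= SLA_THRESHOLD


# 9,960 proper responses out of 10,000 requests is 99.6% (compliant);
# 9,940 out of 10,000 is 99.4% (not compliant).
```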

While the proposed rule does not explicitly prohibit authorized third parties from using screen scraping methods to obtain covered data from data providers, the notice of proposed rulemaking discussion clearly describes screen scraping as a disfavored practice. The CFPB refers to screen scraping as a security risk (as it proliferates the use of consumers’ account credentials), a method that can cause over-collection of consumer data and lead to data inaccuracies, and a practice that can overburden data providers’ systems and increase their liability risks. The Bureau’s proposed rule clearly intends for the developer portals to be the primary way third parties and data aggregators access covered data from data providers.

Data providers will need to have reasonable written policies and procedures for complying with the rule and will need to make certain information publicly available, such as developer interface documentation (technical data to help third parties use the interface) and performance data for the developer portal.

Requirements for third parties and data aggregators

To obtain covered data on behalf of a consumer under the proposed rule, a third party must obtain a consumer’s authorization. The authorization must be clear and conspicuous and be separate from other materials or terms. To be valid, the authorization must be signed (electronic or written) by the consumer and include the following information:

  • The name of the third party that will be authorized to access covered data.
  • The name of the data provider that controls or possesses the covered data that the third party seeks to access on the consumer’s behalf.
  • A brief description of the product or service that the consumer has requested the third party provide and a statement that the third party will collect, use, and retain the consumer’s data solely for the purpose of providing that product or service to the consumer.
  • The categories of covered data that will be accessed.
  • A statement that the third party certifies that it agrees that it will limit its collection, use, and retention of covered data to what is reasonably necessary to provide the consumer’s requested product or service.
  • A description of the mechanism a consumer may use to revoke the authorization.
  • The name of any data aggregator that will assist the third party with accessing covered data and a brief description of the services the data aggregator will provide.

If a third party uses a data aggregator to obtain covered data, the data aggregator may provide consumers with the authorization notices and obtain consumer’s express consents on behalf of the third party. The data aggregator would need to provide its own certification to the consumer regarding compliance with the rule and the restrictions on the collection, use, and disclosure of the consumer’s covered data. The third party would still be responsible for complying with the rule’s authorization requirements.

A consumer’s authorization, unless revoked earlier, would remain in effect for 12 months. Third parties will need to obtain fresh consumer authorizations every 12 months. When an authorization expires, a third party may no longer collect covered data from a data provider, and the third party may no longer use or retain covered data that was collected pursuant to the expired authorization unless the retention of that covered data remains reasonably necessary to providing the consumer’s requested product or service.
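The authorization disclosures and the 12-month validity period lend themselves to a simple record with an expiry check; a sketch, with all field and class names assumed for illustration:

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

AUTHORIZATION_TERM_DAYS = 365  # authorizations lapse after 12 months


@dataclass
class ConsumerAuthorization:
    """Illustrative record of the disclosures the proposed rule requires."""
    third_party: str                    # who is authorized to access data
    data_provider: str                  # who holds the covered data
    product_description: str            # requested product or service
    covered_data_categories: list       # categories to be accessed
    revocation_mechanism: str           # how the consumer can revoke
    data_aggregator: Optional[str]      # aggregator assisting, if any
    signed_on: date                     # electronic or written signature date

    def expired(self, today: date) -> bool:
        """After expiry, fresh authorization is needed to keep collecting."""
        return today > self.signed_on + timedelta(days=AUTHORIZATION_TERM_DAYS)


auth = ConsumerAuthorization(
    third_party="Example Budget App",
    data_provider="Example Bank",
    product_description="Spending insights dashboard",
    covered_data_categories=["transactions", "account_balance"],
    revocation_mechanism="In-app settings page",
    data_aggregator="Example Aggregator LLC",
    signed_on=date(2024, 1, 10),
)
```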

As described in the authorization form requirements above, a third party may only collect, use, and disclose covered data as reasonably necessary to provide a specific product or service that a consumer requested. The proposed rule would not permit covered data to be collected, used, or disclosed for any secondary purposes, and the proposal expressly prohibits collecting, using, or disclosing covered data for targeted advertising purposes, to cross-sell other products or services, or for any sale of covered data.

The systems that a third party uses to collect, use and retain covered data would need to comply with GLBA’s information security requirements; if a third party is not already subject to GLBA’s security requirements, it would need to comply with the prescriptive security requirements of the Federal Trade Commission’s Safeguards Rule, 16 CFR Part 314.

Third parties would need to maintain their own internal written policies on procedures to comply with the rule and the rule’s record retention requirements.

Industry standards and compliance dates

The proposed rule contemplates that data providers’ and third parties’ compliance with qualified industry standards issued by a CFPB-recognized standard-setting body could be used as indicia of compliance with the rule. However, the proposed rule did not identify or discuss any existing industry standards that could meet the Bureau’s requirements.

The proposed rule sets forth staggered compliance deadlines for data providers; the proposed rule does not include any compliance deadlines for third parties. The data provider compliance dates turn on the size of the respective data provider.

  • The largest data providers, depository institutions with at least $500 billion in total assets and nondepository institutions that generated at least $10 billion in revenue in the preceding calendar year or that are projected to do so in the current calendar year, would need to comply with the rule within 6 months after the final rule is published in the Federal Register.
  • Depository institutions with at least $50 billion in total assets and non-depository institutions that generated less than $10 billion in revenue in the preceding calendar year or that are projected to generate less than $10 billion in revenue in the current calendar year, would need to comply with the rule within 1 year after the final rule is published.
  • Depository institutions with at least $850 million in total assets would need to comply with the rule within 2 1/2 years after the final rule is published in the Federal Register.
  • Depository institutions with less than $850 million in total assets would need to comply with the rule within 4 years after the final rule is published.
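The staggered deadlines amount to a lookup by institution type and size; a sketch of the proposed tiers, simplified (e.g., the nondepository revenue test also covers projected current-year revenue, which is not modeled here):

```python
BILLION = 1_000_000_000
MILLION = 1_000_000


def compliance_window_months(depository: bool,
                             total_assets: int = 0,
                             annual_revenue: int = 0) -> int:
    """Months after Federal Register publication before compliance is due.

    Tiers as described in the proposed rule (2.5 years = 30 months,
    4 years = 48 months).
    """
    if depository:
        if total_assets >= 500 * BILLION:
            return 6
        if total_assets >= 50 * BILLION:
            return 12
        if total_assets >= 850 * MILLION:
            return 30
        return 48
    # Nondepository institutions are tiered by revenue.
    return 6 if annual_revenue >= 10 * BILLION else 12
```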

As the comment period has now closed, the CFPB is in the process of reviewing the comments it received from the public.

Comparison with EU PSD2 Requirements

The EU PSD2 provides rules to ensure legal certainty for consumers, merchants, and companies within the payment chain and modernizes the legal framework for the market for payment services. It introduced several novelties in the payment services field, creating new opportunities for payment service users and enhancing transparency.

In particular, the PSD2 gave Open Banking a stable regulatory framework by regulating new types of payment service providers which play a significant role in the Open Banking process: account information service providers (AISPs) and payment initiation service providers (PISPs).

AISPs and PISPs offer value-added services to users by accessing their account data held by banks and other payment account providers, upon users’ request. While PISPs are able to initiate payment orders at the request of a user concerning the user’s payment account held at another payment service provider, AISPs gather and consolidate information on one or more payment accounts held by the user either with another payment service provider or with more than one payment service provider, thus allowing the user to have an overall view of their financial situation at any given moment.

Unlike the CFPB’s proposed rule, which applies to financial data relating to “consumers,” the PSD2 applies to payment service providers which target both individuals and legal entities. However, similar to the CFPB’s proposed rule, the PSD2 excludes some financial product and service providers from its scope, such as services that entail creditworthiness assessments of the payment service user or audit services performed based on the collection of information via an account information service as well as accounts other than payment accounts (e.g., savings, investments).

The PSD2 states that any processing of personal data, including the provision of information about the processing, for the purposes of the PSD2, shall be carried out in accordance with the General Data Protection Regulation (GDPR). However, the interplay of the GDPR and the PSD2 generated several uncertainties, which the European Data Protection Board (EDPB) has partially addressed in its Guidelines 06/2020 on the interplay of the Second Payment Services Directive and the GDPR.

General requirements for AISPs and PISPs under the EU PSD2

The PSD2 regulates the legal conditions under which PISPs and AISPs may access payment accounts to provide their services to users and imposes obligations vis-à-vis banks and other credit institutions holding users’ information.

  • User consent and mandatory sharing of user information

The PSD2 provides that the access and use of payment and account information services are rights of the user, meaning that the providers holding such information (usually banks) must share users’ information with PISPs and AISPs, upon users’ explicit consent.

Nonetheless, the PSD2 does not set an “expiration date” on this authorization. It merely provides that payment service providers shall only access, process, and retain personal data necessary to provide their payment services, with the user’s explicit consent.

This explicit consent should be regarded as an additional requirement of a contractual nature relating to the access to, and subsequent processing and storage of, personal information for the purpose of providing payment services; it is, therefore, not the same as express consent under the GDPR. When entering into a contract with a payment service provider under the PSD2, users must be made fully aware of the specific categories of personal information that will be processed. Further, they must be made aware of the specific (payment service) purpose for which their personal information will be processed and must explicitly agree to these clauses. Such clauses should be clearly distinguishable from other matters dealt with in the contract and must be explicitly accepted by the users (similar to the CFPB’s proposed requirement that a consumer’s authorization be separate from any other terms).

Further transparency obligations on the purpose for which users’ data is processed arise from the GDPR.

  • Secondary use and purpose limitation

Since the PSD2 entered into force in 2016, the provision of ancillary value-added services has become a primary business model for AISPs and PISPs operating in the European market. It is common for PISPs, in addition to enabling payments to merchants on the Internet, to offer ancillary services such as reloading funds onto prepaid cards or paying commercial invoices, while AISPs may also offer services aimed at improving customers’ financial habits by planning expenses and savings or supporting credit-scoring processes.

However, similar to the prohibition of secondary use envisaged by the CFPB’s proposed rule, the PSD2 considerably restricts how AISPs and PISPs may process data for purposes other than providing the service requested by the user. This limitation must be read in conjunction with the principle of purpose limitation set forth by the GDPR, with the result that the processing for another purpose is not allowed unless the user has given consent under the GDPR or the processing is required by EU or member state law to which the AISP or PISP is subject (e.g., anti-money laundering purposes).

As a result, AISPs and PISPs must collect end-customer consents distinguishing between those required under PSD2 (i.e., consents allowing access to users’ data by intermediaries offering PIS and AIS services) and those necessary under the GDPR (i.e., consents allowing the extracted information to be transferred to other parties or processed to pursue purposes other than those strictly necessary to provide payment services).

Therefore, the PSD2 considerably restricts the possibilities for processing users’ data for purposes incompatible with those for which the data was initially collected.

  • Data minimization 

The PSD2 considerably restricts the ability of AISPs and PISPs to collect data beyond the minimum necessary to provide the service requested by the user: data collection and subsequent processing must be limited to what is strictly necessary, consistent with the data minimization principle set forth by the GDPR.

For instance, AISPs’ access is limited to the information from designated payment accounts and associated payment transactions. They shall not use, access, or store any data for purposes other than performing the account information service explicitly requested by the user, in accordance with the GDPR, which requires that personal data be collected only for specified, explicit, and legitimate purposes.

Therefore, an AISP should make explicit in the contract the specific purposes for which personal account information will be processed, in the context of its account information service.

  • Data retention 

Unlike the CFPB’s proposed rule, the PSD2 does not prescribe data retention periods, but the GDPR does. Besides collecting the minimum amount of data possible, the service provider must also set limited retention periods: open-ended retention terms or permanent data storage are generally incompatible with the GDPR. Personal data should therefore only be stored by the service provider for as long as necessary for the purposes of the service requested by the payment service user.

  • Implementation of security measures and identification requirements

The implementation of adequate security measures is directly addressed by the PSD2, which, similar to the CFPB’s proposed rule, relies on standards promulgated by a dedicated body.

Under the PSD2, PISPs and AISPs, on the one hand, and the account servicing payment service provider, on the other hand, should observe the necessary data protection and security requirements established by, or referred to in, the PSD2 or included in the regulatory technical standards.

Those standards are set out in Commission Delegated Regulation (EU) 2018/389, which lays down regulatory technical standards for strong customer authentication and common and secure open communication.

Moreover, the GDPR always requires organizations to implement appropriate security measures, considering the state of the art, the costs of implementation, and the nature, scope, context, and purposes of processing, as well as the likelihood and severity of risks to the rights and freedoms of natural persons.

The upcoming revision of the EU PSD2

Although the PSD2 has significantly contributed to the development of the European payments market and improved customer protection and the efficiency, transparency, and choice of payment instruments, a few shortcomings have emerged, such as regulatory deficiencies regarding new players and services in the payments market, divergences in implementation across Member States, and unclear alignment with other EU legislation.

To address these issues, the European Commission has published a proposal to revise the PSD2, comprising a proposal for a directive on payment services and electronic money services (PSD3) and a proposal for a regulation on payment services in the internal market (PSR).

US: Regulators Enhance Information Security Requirements for Financial Services Companies
https://privacymatters.dlapiper.com/2023/11/us-regulators-enhance-information-security-requirements-for-financial-services-companies/ (November 6, 2023)

Sweeping Amendments to NYDFS Cybersecurity Regulation

On November 1, 2023, the New York Department of Financial Services (NYDFS) announced extensive amendments to its cybersecurity requirements for financial institutions issued under 23 NYCRR Part 500.  The amendments are intended to address the evolution in the cybersecurity landscape since the regulation was first enacted in 2017, including the increasing sophistication of threat actors and improvements in the tools available for organizations to protect themselves. Covered entities continue to include entities operating under, or required to operate under, a license, registration, charter, certificate, permit, accreditation or similar authorization under NY Banking Law, Insurance Law or Financial Services Law.

Key changes in the amended regulation include:   

  • Creating a new class of covered entities (based on revenue and/or employee thresholds) that are subject to heightened requirements;
  • Enhancing requirements related to vulnerability management, access controls, and the use of encryption;
  • Providing prescriptive requirements related to the use of multi-factor authentication;
  • Requiring the implementation of policies and procedures related to business continuity and disaster recovery;
  • Requiring additional controls to prevent unauthorized access to information systems;     
  • Updating cybersecurity incident notification requirements, including a new requirement to report ransomware payments; and
  • Amending the scope of the exemptions and enforcement provisions under the regulation.

The amended requirements will take effect in phases, with some having already come into force on November 1, 2023.

FTC Implements Security Incident Notification Requirement under Safeguards Rule

In other financial services information security developments, the Federal Trade Commission (FTC) issued a final rule creating a security incident notification requirement under its Gramm-Leach-Bliley Act (GLBA) Safeguards Rule. The FTC’s Safeguards Rule implements GLBA’s security requirements; the FTC has Safeguards Rule jurisdiction over mortgage lenders, certain non-bank lenders, finance companies, mortgage brokers, account servicers, check cashers, wire transferors, collection agencies, credit counselors and financial advisors, tax preparation firms, and investment advisors that are not required to register with the Securities and Exchange Commission.

Under the final rule, covered financial institutions must electronically notify the FTC within 30 days of discovering a “notification event” that involves the information of at least 500 consumers. The scope of data and incidents that could be subject to the rule is very broad. A notification event is defined as the “acquisition of unencrypted customer information without the authorization of the individual to whom the information pertains.” The Safeguards Rule broadly defines “customer information” as “any record containing nonpublic personal information about a [consumer] customer of a financial institution, whether in paper, electronic, or other form.” For example, the fact that a consumer is a financial institution’s customer would itself be customer information subject to the rule. The definition of “notification event” also presumes that customer information was “acquired” if there was unauthorized access to such information; to rebut this presumption, a financial institution must have reliable evidence showing that there has not been, or could not reasonably have been, unauthorized acquisition. Unlike U.S. state security breach notification laws, the rule does not include a good-faith exception for situations where an employee or contractor mistakenly accesses or acquires personal information.
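For readers who maintain incident-response tooling, the threshold logic described above can be sketched in code. The following is a hypothetical simplification, not legal advice: the class and function names are our own illustration, while the unencrypted-data scope, the presumption of acquisition upon unauthorized access, and the 500-consumer threshold reflect the rule as summarized in this post.

```python
from dataclasses import dataclass

@dataclass
class SecurityIncident:
    consumers_affected: int
    data_was_encrypted: bool
    unauthorized_access: bool
    # Reliable evidence rebutting the presumption that accessed data was acquired
    reliable_evidence_no_acquisition: bool

def ftc_notification_required(incident: SecurityIncident) -> bool:
    """Return True if the incident appears to be a reportable 'notification event'."""
    if incident.data_was_encrypted:
        # The rule covers acquisition of *unencrypted* customer information
        return False
    # Unauthorized access presumes acquisition unless reliably rebutted
    acquired = (incident.unauthorized_access
                and not incident.reliable_evidence_no_acquisition)
    return acquired and incident.consumers_affected >= 500
```

In practice, of course, each element of this determination (encryption status, authorization, reliability of evidence) is a fact-intensive legal judgment rather than a boolean flag.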

The rule will take effect 180 days after its publication in the Federal Register. The FTC will post the notifications it receives publicly on its website.

For more information, please contact your DLA relationship Partner, the authors of this blog post, or any member of our Data Protection team.
