Privacy Matters – DLA Piper’s Global Privacy and Data Protection Resource https://privacymatters.dlapiper.com/category/direct-marketing/

UK: Google’s U-Turn on Device Fingerprinting: ICO’s Response and Subsequent Guidance https://privacymatters.dlapiper.com/2025/01/googles-u-turn-on-device-fingerprinting-icos-response-and-subsequent-guidance/ Thu, 30 Jan 2025 18:25:52 +0000

In December 2024, the Information Commissioner’s Office (ICO) responded to Google’s decision to lift its prohibition on device fingerprinting (which involves collecting and combining information about a device’s software and hardware in order to identify the device) for organisations using its advertising products, effective from 16 February 2025 (see an overview of Google’s new Ads Platforms policies here). This follows Google’s earlier decision, in July 2024, to retain third-party cookies.

In its response, the ICO criticized Google’s decision to permit device fingerprinting for advertising purposes as “irresponsible” and emphasised that device fingerprinting:

  1. Requires Consent: device fingerprinting enables a device to be identified even where cookies are blocked or its location is disguised (hence its common use for fraud prevention), but the ICO reinforced that it remains subject to the usual consent requirements.
  2. Reduces User Control: Despite various browsers now offering “enhanced” tracking protection, the ICO stated that device fingerprinting is not a fair means of tracking users online as it diminishes people’s choice and control over how their information is collected.

This statement echoes concerns previously voiced by Google itself, which had stated that device fingerprinting “subverts user choice and is wrong”.
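To illustrate why fingerprinting sidesteps cookie controls, here is a minimal, purely illustrative Python sketch (the attribute names and values are hypothetical, not any vendor’s actual method): because the identifier is recomputed from device characteristics on each visit, nothing needs to be stored on the device for tracking to persist.

```python
import hashlib

def fingerprint(attributes: dict) -> str:
    """Combine device/browser attributes into a stable identifier.

    No cookie is set: the same attributes yield the same identifier
    on every visit, which is why blocking cookies does not stop it.
    """
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical attributes a script might collect from a browser:
device = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen": "1920x1080x24",
    "timezone": "Europe/London",
    "fonts": "Arial,Calibri,Segoe UI",
}
```

Changing any single attribute changes the identifier, but in practice enough attributes stay stable between visits for the device to be recognised, which is precisely the loss of user control the ICO describes.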

With the potential for fingerprinting to replace the long-debated third-party (3P) cookie functionality, this statement forms part of a shift in regulatory focus to technologies beyond cookies. Various technologies have recently received greater scrutiny, both in the ICO’s Draft Guidance on the use of storage and access technologies (“ICO’s Draft Guidance“) – interestingly issued in December 2024 to coincide with the Google update – and in the European Data Protection Board (EDPB) Guidelines 2/2023 on the Technical Scope of Art. 5(3) of the ePrivacy Directive.

ICO Draft Guidance: Key Takeaways

The ICO’s Draft Guidance explores the practical application of the Privacy and Electronic Communications Regulations (PECR) requirement that consent must be obtained by the user for any storage or access of information on/from a device (‘terminal equipment’), unless such storage/access is strictly necessary for the purposes of a communication or to provide a service requested by the user.

In particular, the Draft Guidance addresses the following areas, each of which is explored in its respective section below:

Technologies

The ICO’s Draft Guidance looks at how and why the rules on storage and access of device information apply to various types of technologies used in web browsers, mobile apps or connected devices, namely: cookies; tracking pixels; link decoration and navigational tracking; web storage; scripts and tags; and fingerprinting techniques. The technologies the ICO focuses on overlap to a large extent with the examples used by the EDPB in its guidelines. However, taking the analysis on pixels as an example, the EDPB suggests that any distribution of tracking links/pixels to the user’s device (whether via websites, emails, or text messaging systems) is subject to Article 5(3) of the ePrivacy Directive, as it constitutes ‘storage’ even if only temporarily via client-side caching. The ICO’s guidance is less clear, suggesting that tracking pixels are only subject to Regulation 6 of PECR when they store information on the user’s device. This might imply a less expansive view than the EDPB’s, highlighting the importance of remaining alive to jurisdictional nuances in any global tracking campaign.
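The pixel analysis can be made concrete. A tracking pixel is typically a 1x1 image whose URL has been “decorated” with identifying parameters; when a browser or email client fetches it, those identifiers reach the sender’s server, and the response may also be cached on the device. A minimal sketch (all URLs and parameter names here are hypothetical):

```python
from urllib.parse import urlencode, urlparse

def decorate(base_url: str, campaign_id: str, user_ref: str) -> str:
    """Append identifying parameters to a link or 1x1 pixel URL.

    When the decorated URL is fetched (e.g. as an <img> tag in an
    email), the identifiers reach the server that serves the pixel.
    """
    params = urlencode({"cid": campaign_id, "uref": user_ref})
    separator = "&" if urlparse(base_url).query else "?"
    return f"{base_url}{separator}{params}"

# An email client rendering this as an image would silently report
# that a given recipient opened a given campaign:
pixel = decorate("https://pixels.example.test/open.gif", "spring25", "u-123")
```

The divergence described above turns on whether merely delivering such a URL to the device (with incidental caching) is itself “storage”, or whether something must be deliberately written to the device.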

Detailed Consent Requirements

The ICO reiterates that for a PECR consent to be valid, it must meet UK GDPR standards (freely given, specific, informed and unambiguous statement of the individual’s wishes indicated by a clear affirmative action).

    The ICO highlights that, where personal data is processed, consent must be provided by the data subject (in contrast to PECR’s user/subscriber consent requirement). This tension is a long-standing issue, but it remains unclear how the party collecting consent for personal data processed via cookies (or a similar technology) is supposed to know whether the user of a device has changed without requiring re-consent or user identification on each visit, or carrying out background identification using fingerprinting or similar techniques, which entails more data processing and may itself be intrusive.

    In line with recent ICO statements in relation to the lack of ‘reject all’ options, the ICO emphasises that subscribers/users must be able to refuse the use of storage and access technologies as easily as they can consent. Additional points of interest for controllers include:

    • That users must have control over any use of non-essential storage and access technologies. While this could, on a conservative reading, be interpreted as needing US-style granular per-cookie consent, the examples provided suggest high-level consent mechanisms expressed per category (e.g., analytics, social media tracking, marketing) are still acceptable;
    • Clarification that you must specifically name any third parties whose technologies you are requesting consent for (this information can be provided in a layered fashion, provided this is made very clear). However, if controls are not required at an individual cookie level, which appears to be the case, this naming becomes less meaningful for data subjects, who cannot act on the additional information: they only have the choice of rejecting all storage and access technologies for each purpose category (e.g. all analytics cookies/technologies), rather than rejecting a particular third party; and
    • Clarification that users must be provided with controls over any use of storage and access technologies for non-essential purposes (albeit this was arguably already required in order to facilitate withdrawal of consent/changing of preferences on an ongoing basis).

    Exemptions to consent: Strictly Necessary

    Leaving aside technologies necessary for communications, the ICO emphasises that the “strictly necessary” exemption applies where the purpose of the storage or access is essential to provide the service the subscriber or user has requested. Helpfully, the ICO Draft Guidance clarifies that technologies used to comply with applicable law (e.g. meeting security requirements) can be regarded as “strictly necessary”, such that no consent is required. The exemption will not apply, however, if there are other ways to comply with the legislation without using cookies or similar technologies.

    Other examples of activities likely to meet the exemption include: (i) ensuring the security of terminal equipment; (ii) preventing or detecting fraud; (iii) preventing or detecting technical faults; (iv) authenticating the subscriber or user; and (v) recording information or selections made on an online service.

    One area of ambiguity remains in relation to fraud prevention and detection. In the financial services sector, websites/apps often use third-party fingerprinting for fraud detection (in order to meet legal obligations to ensure the security of their services). ‘Preventing or detecting fraud’ is listed as an example of an activity likely to meet the exemption, whilst third-party fingerprinting for fraud prevention is used by the ICO as an example of an activity subject to Regulation 6 of PECR, with the implication that consent is needed (albeit this is not stated explicitly). However, the Data (Use and Access) Bill (DUA Bill), if passed in its current form, provides some helpful clarity here, as it states that the use of such technologies should be regarded as “strictly necessary” where used to protect information, for security purposes, to prevent or detect fraud or technical faults, to facilitate automatic authentication, or to maintain a record of selections made by the user.

    Interestingly, the guidance suggests that the use of social media plugins/tools by logged-in users might be strictly necessary, though this does not extend to logged-out users, users who are not a member of that network, or any associated tracking.

    Governance and compliance

    A number of the ICO’s clarifications are likely to affect day-to-day resourcing and operations for any organisation using a material number of storage and access technologies:

    • Governance: the ICO emphasises what it expects in respect of governance of storage and access requirements, including an audit checklist, emphasising the need to regularly audit the use of such technologies and ensure that the rest of the consent ecosystem (including transparency, consent, data sharing, and subsequent processing) is consistent and up to date. This is likely to be resource intensive, and few organisations will be set up for this level of assurance.
    • Transparency: the ICO guidance reinforces the need for transparency around whether any third parties will store or access information on the user’s device or receive this information, making clear that all third parties providing cookies or receiving data must be named (avoiding ambiguous references to “partners” or “third parties”), and that specific information must be provided about each, taking into account UK GDPR considerations where personal data is processed. This will be a considerable challenge for complex ecosystems, most notably in the context of online advertising (albeit this has been a known challenge for some time).
    • Consent Ecosystem: The guidance makes very clear that a process must be in place for passing on when a user withdraws their consent. In practice, the entity collecting the consent is responsible for informing third parties when consent is no longer valid. This is crucial but challenging to comply with, and is again perhaps most relevant in the context of online advertising. 
    • Subsequent Processing: as it has done in the past, the ICO continues to suggest strongly that any subsequent processing of personal data obtained via storage/access technologies on the basis of consent should also be based on consent, going so far as to suggest that reliance on an alternative lawful basis (e.g. legitimate interests) may invalidate any initial consent received.
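The consent-ecosystem expectation, that the entity collecting consent passes withdrawals on to the third parties it named, can be sketched as follows. This is a simplified illustration under our own assumptions: the endpoints, payload shape and purpose categories are entirely hypothetical, and a real integration would use whatever API each third party actually exposes.

```python
# Hypothetical registry of the third parties named in the consent
# request, keyed by the purpose category the user consented to.
THIRD_PARTIES = {
    "analytics": ["https://consent.example-analytics.test/api"],
    "marketing": ["https://consent.example-adtech.test/api",
                  "https://consent.example-dsp.test/api"],
}

def withdraw_consent(user_id: str, purposes: list, notify) -> list:
    """Fan a withdrawal signal out to every third party per purpose.

    `notify` is injected (e.g. an HTTP POST in production) so the
    propagation logic can be exercised without network access.
    """
    receipts = []
    for purpose in purposes:
        for endpoint in THIRD_PARTIES.get(purpose, []):
            notify(endpoint, {"user": user_id,
                              "purpose": purpose,
                              "status": "withdrawn"})
            receipts.append((endpoint, purpose))
    return receipts
```

Keeping a record of the receipts is what makes the audit expectation above workable: the organisation can evidence when each named third party was told consent was withdrawn.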

    Conclusion

    As device fingerprinting and other technologies evolve, it is crucial for organisations to stay informed, ensure compliance with the latest guidance, and bear in mind that regulation may differ between the EU and the UK.

    The ICO’s Draft Guidance provides helpful clarity on existing rules in the UK, including detailed examples of how to conduct cookie audits, but does not otherwise provide practical guidance on how to overcome many of the operational privacy challenges faced by controllers (such as monitoring changing users and managing consent withdrawals within online advertising ecosystems).

    With increasing regulatory commentary and action in this space, including the ICO’s most recent announcement regarding its focus on reviewing cookie usage on the biggest UK sites, now is the time to take stock of your tracking technologies and ensure compliance!

    The ICO’s Draft Guidance is currently open for consultation, with input sought by 5pm on Friday 14th March 2025. If you have any questions or would like to know more, please get in touch with your usual DLA contact.

California Attorney General Settles with DoorDash over Alleged Sale of Personal Information https://privacymatters.dlapiper.com/2024/02/california-attorney-general-settles-with-doordash-over-alleged-sale-of-personal-information/ Fri, 23 Feb 2024 01:17:57 +0000

    Overview

    On February 21, 2024, the California Attorney General (CA AG) announced that it had reached a settlement with DoorDash over allegations that the company failed to comply with “sale” requirements under the California Consumer Privacy Act (CCPA) and disclosure requirements under the California Online Privacy Protection Act (CalOPPA). The settlement requires DoorDash to pay a $375,000 civil penalty and comply with specific injunctive terms.

    The CA AG’s complaint alleges that DoorDash participated in marketing co-operatives (“co-ops”) that involved the company providing its customers’ personal information (such as names, addresses, and transaction histories) to the co-op without providing its customers with notice or an opportunity to opt-out of the sale. Upon receiving DoorDash’s customer personal information, the co-op would combine DoorDash’s customer data with the customer data of other third-party co-op members, analyze the data, and allow members to send mailed advertisements to potential leads. The CA AG considered such data disclosure a “sale” of personal information under the CCPA’s broad definition of that term. Specifically, DoorDash received “valuable consideration” in exchange for disclosing its customer data to the co-op, namely the “opportunity to advertise its services directly to the customers of the other participating companies.”

    The CA AG’s second cause of action invoked CalOPPA, a 20-year-old California privacy law that imposes transparency obligations on companies that operate websites for commercial purposes and collect personally identifiable information from Californians. The complaint alleged violations of CalOPPA by DoorDash due to the company’s failure to disclose in its privacy policy that it would share its customers’ personally identifiable information with other third-party businesses (e.g., marketing co-op members) for those businesses to contact DoorDash customers with ads.

    Key Takeaways

    This settlement serves as a critical reminder of the importance of compliance with current and emerging state privacy laws, emphasizing the broad definition of “sale” under the CCPA and the strict requirements for transparency and consumer choice. Additionally, we expect the California Privacy Protection Agency, another California privacy regulator (vested with full administrative power, authority, and jurisdiction to implement and enforce the CCPA) to ramp up its own investigative and enforcement efforts this year. Thus, businesses should consider the following:

    • “Selling” is Broader than Cookies – companies should re-assess how their data disclosure activities may be considered “selling” under the CCPA. Many companies focus on the use of third-party ad and analytics cookies on their websites as the main trigger for “sale” compliance obligations under the law. This settlement makes clear that companies should broaden their review and assessment of their marketing department’s use of personal information to consider non-cookie related data disclosures.
    • Review and Update Privacy Policies – an outdated, unfair, deceptive, or misleading privacy policy serves as an online billboard announcing a company’s non-compliance with state privacy laws as well as state unfair competition laws (such as California’s Unfair Competition Law (UCL)). As this settlement demonstrates, it can be a magnet for consumer complaints and regulatory scrutiny (including at the federal level under Section 5 of the Federal Trade Commission Act). Companies should review and update their privacy policies whenever they materially change how they handle personal information; under the CCPA, privacy policies must be updated at least annually.
    • Opt-Out Mechanisms. Companies should ensure that compliant opt-out mechanisms, including an interactive webform and a “Do Not Sell or Share My Personal Information” or “Your Privacy Choices” link, are in place. Opt-out mechanisms must also recognize and respond to universal opt-out preferences signals, such as the Global Privacy Control (GPC) signal.   
    • Don’t Forget the Apps – the complaint noted that both the DoorDash website and mobile application (App) failed to inform consumers about the sale of their personal information and their right to opt-out. Companies that collect personal information via an App and engage in “backend” selling of personal information should ensure that the App includes sufficient CCPA disclosures and a mechanism for users to easily opt-out of the sale of their personal information (see here for the CA AG’s previous announcements of an investigative sweep focused on violations of CCPA in the App context).
    • Marketing Co-Ops – this enforcement action makes clear that California regulators consider a company’s participation in a marketing co-operative to be a “sale” under the CCPA. Companies participating in marketing co-ops and other third-party data-sharing arrangements should carefully review their agreements with the data recipients to ensure they restrict the recipients’ ability to further disclose or sell consumer personal information.
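On the opt-out point above: the Global Privacy Control signal is transmitted as a `Sec-GPC: 1` HTTP request header (per the GPC specification). A minimal server-side check might look like the sketch below; the function names are our own, and a production system would also need to persist the preference and apply it to downstream disclosures.

```python
def honors_gpc(headers: dict) -> bool:
    """A request carrying the `Sec-GPC: 1` header expresses the
    user's universal do-not-sell/share preference."""
    return headers.get("Sec-GPC", "").strip() == "1"

def may_sell(headers: dict, user_opted_out: bool) -> bool:
    """Selling/sharing is off the table if either the GPC signal
    or an explicit opt-out (e.g. via the webform) is present."""
    return not (honors_gpc(headers) or user_opted_out)
```

The key compliance point is that the GPC header must be treated as equivalent to a manual opt-out, not as a weaker or advisory signal.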

    For more information about these developments and the CCPA in general, contact your DLA relationship Partner, the authors of this blog post, or any member of DLA’s Data, Privacy and Cybersecurity team.

End of Meta’s targeted ads model? https://privacymatters.dlapiper.com/2022/12/end-of-metas-targeted-ads-model/ Fri, 09 Dec 2022 11:05:27 +0000

    Authors: Ewa Kurowska-Tober, Andrew Serwin, John Magee and Madison Swoy

    A trio of forthcoming decisions against tech giant Meta may signal the end for Meta’s targeted ads model, though the issue is likely to rumble on for some time.

    For many years, Meta has relied on contractual necessity (Article 6(1)(b) of the GDPR) as a legal basis for the processing of its users’ personal data in order to present personalised ads to them on the company’s platforms, such as Facebook or Instagram. This seemed to be the most suitable legal basis because it is probable that many users would refuse to allow the processing of their data if Meta relied on their consent or they would object to the processing if Meta used its own legitimate interest as the basis for doing so. Meta allows its users to opt out of targeted ads, which are based on personal data obtained from the websites and apps of third parties, but it does not offer a similar option in the case of ads based on data collected through its own platforms. However, European privacy regulators are currently looking closely at Meta’s practices and this approach may soon have to be changed.

    On Tuesday, 6 December, it was reported that the European Data Protection Board (EDPB) has approved three decisions in proceedings following three complaints made against Facebook, Instagram, and WhatsApp concerning their use of targeted ads. Like many other American tech giants (e.g. Google and Apple), Meta’s European subsidiary is established in Ireland and its lead supervisory authority is the Irish Data Protection Commission (DPC). However, Meta’s data processing activities affect users in all EU Member States, and therefore other European data protection authorities and the EDPB also have a say on the decisions, under the GDPR’s consistency mechanism.

    According to the three decisions, which have not yet been communicated to the public, Meta will have to stop relying on its terms of service as a justification for the use of targeted ads on its platforms. The decisions may be appealed, which means that Meta is likely to have several more years to continue with this approach, while at the same time develop alternative ad-displaying models. If the decisions are upheld by the Irish courts, many users are likely to opt out of the targeted ads, which account for a sizeable part of Meta’s revenue. The company argues that its model of personalising ads is necessary for the provision of its services and does not deprive the users of control over how their personal data is used, since they are free to decide whether they want to continue using Meta’s services.

    Meta’s troubles were immediately recognised by investors and the value of the company’s shares fell 6.2% in mid-session trading on the day when the EDPB’s position on the upcoming decisions was reported. This adds to Meta’s long list of privacy-related problems – the DPC has already fined it EUR 405 million for a violation of children’s privacy by Instagram, EUR 265 million for a Facebook data-scraping breach, and EUR 17 million for a string of security lapses by Facebook. Facebook has also been hit with a EUR 60 million fine for cookie consent violations by the French data protection authority (CNIL) and faces a potential fine and suspension order for transferring its users’ data to the United States, which may be issued by the DPC following the long-running series of complaints and proceedings initiated by privacy activist Max Schrems.

    Meta’s revenues have also been significantly affected by growing competition from the Chinese video-sharing platform TikTok, which is gaining popularity among younger users, and Apple’s decision to give iPhone users a choice of whether they want their activities in third-party apps like Facebook or Instagram to be tracked.

    Considering that providing personalised ads has been the core of Meta’s business for many years and that they represent the most significant source of its revenue, the DPC’s decisions may be the most serious blow the company has ever suffered. They will impact not only Meta’s position on the EU market, but also the activities of other digital platforms which rely to a large extent on delivering targeted ads to their users.

    Implications Around the World

    Ireland. The decisions, once finalised and issued by the DPC, signal a hardening of the DPC’s approach towards Meta, and potentially towards other social media businesses for which the DPC acts as lead supervisory authority under the GDPR’s one-stop-shop enforcement mechanism. While the DPC has faced some criticism around Europe for perceived delays in enforcement action against the tech giants, the series of decisions demonstrates the GDPR’s complex consistency mechanism in action, which was designed to take account of the concerns of all supervisory authorities. GDPR enforcement remains at an early stage, and the decisions, each of which touches on novel issues of law, are likely to be appealed through the Irish courts.

    United States. GDPR has served as a model for new data privacy laws across the United States, and many U.S. companies are beginning to use the GDPR as a baseline to ensure compliance with the patchwork of state data privacy laws. Nonetheless, the U.S. state data privacy laws have almost completely eschewed the GDPR notion of “legal bases” for processing, so the impact on U.S. users may not be substantial. However, we may see additional privacy controls roll out globally as a result of the EDPB’s decision, particularly around which advertisements users are shown in their feeds, and Meta’s behavioral advertising model seems almost certain to take a significant hit company-wide.

    We are watching this case closely and will provide our comments on any significant developments.
