CHINA: Recent Enforcement Trends
https://privacymatters.dlapiper.com/2025/03/china-recent-enforcement-trends/

Recently, the Cyberspace Administration of China (CAC), the primary data regulator in China, published a newsletter on the government authorities' 2024 enforcement actions against Apps and websites that violated personal data protection and cybersecurity laws.

Based on the official statistics, during 2024, the CAC interviewed 11,159 website platforms, imposed warnings or fines on 4,046 website platforms, ordered 585 websites to suspend or update relevant functions, took down 200 Apps and took administrative actions on 40 mini-programs. The CAC also conducted joint enforcement actions together with the Ministry of Industry and Information Technology and revoked the licenses or shut down 10,946 websites and closed 107,802 accounts.

The following violations were a particular focus of these enforcement activities:

  • Failure to maintain network logs as required by law or to promptly address security risks (such as system vulnerabilities), resulting in incidents such as system attacks, data tampering, and data leaks;
  • Failure to clearly display privacy notices in Apps, obtain necessary consent to process personal data, or provide convenient methods to opt out or de-register accounts;
  • Failure to conduct required recordal or filing for AI models or features built into Apps or mini-apps; and
  • Unreasonably requiring consumers to scan QR codes or perform facial recognition that is not necessary to provide the underlying services.

Around the same time, the National Computer Virus Emergency Response Center, an institution responsible for detecting and handling computer virus outbreaks and cyber attacks under the supervision of the Ministry of Public Security, published a list of Apps that violated the personal data protection laws in the following areas:

  • Failure to provide data subjects with all the required information about the processing (e.g. name and contact details of the controller, categories of personal data processed, purposes of the processing, retention period, etc.) in a prominent place and in clear and understandable language; in particular, failure to provide such information about any third party SDK or plugin is also considered a breach of the law;
  • Failure to provide data subjects with the required details about any separate controller (e.g. name, contact information, categories of personal data processed, processing purposes, etc.) or to obtain the separate consent of data subjects before sharing their personal data with the separate controller;
  • Failure to obtain the separate consent of data subjects before processing their sensitive personal data;
  • Failure to provide users with the App functions to delete personal data or de-register accounts, or to complete the deletion or deregistration within 15 business days; or setting unreasonable conditions for users to de-register accounts;
  • Failure to formulate special rules for processing the personal data of minors (under the age of 14) or to obtain parental consent before processing the personal data of minors; and
  • Failure to take appropriate encryption, de-identification and other security measures, taking into account the nature of the processing and its impact on the rights and interests of data subjects.

The above enforcement focuses are also consistent with the audit points highlighted in the newly released personal data protection audit rules (see our article here). We expect the same enforcement trend to continue into 2025. Companies that process personal data in China or in connection with business in China are advised to review their compliance status with the requirements of Chinese law and take remedial action in a timely manner.

UK: Google's U-Turn on Device Fingerprinting: ICO's Response and Subsequent Guidance
https://privacymatters.dlapiper.com/2025/01/googles-u-turn-on-device-fingerprinting-icos-response-and-subsequent-guidance/

In December 2024, the Information Commissioner's Office (ICO) responded to Google's decision to lift its prohibition on device fingerprinting (which involves collecting and combining information about a device's software and hardware for the purpose of identifying the device) for organisations using its advertising products, effective from 16 February 2025 (see an overview of Google's new Ads Platforms policies here). This follows Google's previous decision in July 2024 to retain third-party cookies.
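To make the mechanism concrete, the sketch below (TypeScript, browser context) shows how a crude fingerprint can be assembled from signals that standard Web APIs expose, without storing anything on the device. It is an illustration only, assuming a handful of commonly used signals, and is not drawn from Google's or any vendor's actual implementation.

```typescript
// Illustrative only: a crude device fingerprint assembled from signals
// that standard Web APIs expose. Real products combine far more signals
// and server-side logic; nothing here is stored on the device itself.
async function buildDeviceFingerprint(): Promise<string> {
  const signals = [
    navigator.userAgent,                              // browser and OS details
    navigator.language,                               // configured language
    String(screen.width),                             // display width
    String(screen.height),                            // display height
    String(screen.colorDepth),                        // colour depth
    Intl.DateTimeFormat().resolvedOptions().timeZone, // time zone
    String(navigator.hardwareConcurrency),            // CPU core count
  ].join("|");

  // Hash the combined signals into a stable, compact identifier.
  const digest = await crypto.subtle.digest(
    "SHA-256",
    new TextEncoder().encode(signals)
  );
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}
```

Because no cookie or other stored identifier is involved, the identifier survives cookie blocking, which is precisely why the ICO treats the technique as reducing user control.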

In its response, the ICO criticised Google's decision to permit device fingerprinting for advertising purposes as "irresponsible" and emphasised that device fingerprinting:

  1. Requires Consent: Device fingerprinting enables devices to be identified even where cookies are blocked or the location is disguised, hence its common use for fraud prevention purposes, but the ICO reinforced that it is subject to the usual consent requirements.
  2. Reduces User Control: Despite various browsers now offering “enhanced” tracking protection, the ICO stated that device fingerprinting is not a fair means of tracking users online as it diminishes people’s choice and control over how their information is collected.

This statement echoes concerns previously voiced by Google itself, which had stated that device fingerprinting "subverts user choice and is wrong".

With the potential for fingerprinting to replace the long-debated third-party (3P) cookie functionality, this statement forms part of a shift in regulatory focus to technologies beyond cookies. Various technologies have recently received greater scrutiny, both in the ICO's Draft Guidance on the use of storage and access technologies ("ICO's Draft Guidance") – interestingly, issued in December 2024 to coincide with the Google update – and in the European Data Protection Board (EDPB) Guidelines 2/2023 on the Technical Scope of Art. 5(3) of the ePrivacy Directive.

ICO Draft Guidance: Key Takeaways

The ICO's Draft Guidance explores the practical application of the Privacy and Electronic Communications Regulations (PECR) requirement that consent must be obtained from the subscriber or user for any storage of, or access to, information on a device ('terminal equipment'), unless such storage or access is strictly necessary for the purposes of a communication or to provide a service requested by the user.

In particular, the Draft Guidance addresses the following areas which are explored further in their respective sections below:

Technologies

The ICO's Draft Guidance looks at how and why the rules relating to storage and access of device information apply to various types of technologies used in web browsers, mobile apps or connected devices, namely: cookies; tracking pixels; link decoration and navigational tracking; web storage; scripts and tags; and fingerprinting techniques. The technologies the ICO focuses on overlap to a large extent with the examples used by the EDPB in its guidelines. However, taking the analysis on pixels as an example, the EDPB suggests that any distribution of tracking links/pixels to the user's device (whether via websites, emails, or text messaging systems) is subject to Article 5(3) of the ePrivacy Directive because it constitutes 'storage', even if only temporarily via client-side caching. The ICO's guidance is less clear, suggesting that tracking pixels are only subject to Regulation 6 of PECR when they store information on the user's device. This might imply a less expansive view compared to the EDPB, highlighting the importance of remaining alive to jurisdictional nuances for any global tracking campaigns.
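For readers less familiar with these techniques, the hedged sketch below (TypeScript, browser context) illustrates two of them: a tracking pixel that sends context to a hypothetical endpoint via an image request, and web storage used to persist an identifier. The endpoint, parameters and storage key are invented for illustration; the point is simply that both involve storing information on, or accessing information from, the terminal equipment.

```typescript
// Illustrative only: two of the techniques covered by the guidance.
// Neither snippet should run before a valid consent (or exemption) applies.
// The endpoint, parameters and storage key are hypothetical.

// A tracking pixel: a 1x1 image request that leaks context (page URL,
// campaign ID) via query parameters and is cached on the device.
function fireTrackingPixel(campaignId: string): void {
  const pixel = new Image(1, 1);
  pixel.src =
    "https://tracker.example.com/pixel.gif" +
    `?campaign=${encodeURIComponent(campaignId)}` +
    `&page=${encodeURIComponent(location.href)}`;
}

// Web storage: writing and later reading an identifier in localStorage is
// both "storage of" and "access to" information on the terminal equipment.
function getOrCreateVisitorId(): string {
  const existing = localStorage.getItem("visitor_id");
  if (existing) return existing;
  const id = crypto.randomUUID();
  localStorage.setItem("visitor_id", id);
  return id;
}
```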

Detailed Consent Requirements

The ICO reiterates that for a PECR consent to be valid, it must meet UK GDPR standards (freely given, specific, informed and unambiguous statement of the individual’s wishes indicated by a clear affirmative action).

The ICO highlights that, where personal data is processed, consent must be provided by the data subject (in contrast to the PECR requirement for user/subscriber consent). This tension is not new, but it remains unclear how the party collecting consent for personal data processed via cookies (or a similar technology) is supposed to know whether the user of a device has changed, without either requiring re-consent or user identification on each visit, or carrying out background identification using fingerprinting or similar techniques, which involves more data processing and may itself be intrusive.

In line with recent ICO statements in relation to the lack of 'reject all' options, the ICO emphasises that subscribers/users must be able to refuse the use of storage and access technologies as easily as they can consent. Additional points of interest for controllers include:

  • That users must have control over any use of non-essential storage and access technologies. While this could, on a conservative reading, be interpreted as requiring US-style granular per-cookie consent, the examples provided suggest that high-level consent mechanisms expressed per category (e.g., analytics, social media tracking, marketing) are still acceptable;
  • Clarification that you must specifically name any third parties whose technologies you are requesting consent for (this information can be provided in a layered fashion, provided it is very clear). However, if controls are not required at an individual cookie level, which seems to be the case, this becomes less meaningful for data subjects, who cannot act on this additional information: they only have the choice of rejecting all storage and access technologies for each purpose category (e.g. all analytics cookies/technologies) rather than a particular third party (see the sketch after this list); and
  • Clarification that users must be provided with controls over any use of storage and access technologies for non-essential purposes (albeit this was arguably already required in order to facilitate withdrawal of consent/changing of preferences on an ongoing basis).
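As an illustration of how category-level consent with named third parties might be structured, the sketch below (TypeScript) models a consent configuration per purpose category. The category and vendor names are hypothetical placeholders, not a prescribed format.

```typescript
// Illustrative only: category-level consent with named third parties.
// Category and vendor names are hypothetical placeholders.
type ConsentCategory = {
  id: "analytics" | "social" | "marketing";
  description: string;
  thirdParties: string[]; // named specifically, not "partners" or "third parties"
  consented: boolean;     // defaults to false: no pre-ticked boxes
};

const consentState: ConsentCategory[] = [
  {
    id: "analytics",
    description: "Measuring how the site is used",
    thirdParties: ["ExampleAnalytics Ltd"],
    consented: false,
  },
  {
    id: "marketing",
    description: "Personalised advertising",
    thirdParties: ["ExampleAds Inc", "ExampleDSP GmbH"],
    consented: false,
  },
];

// Non-essential technologies in a category load only after an affirmative
// opt-in; a "Reject all" action must be as easy as "Accept all".
function isAllowed(category: ConsentCategory["id"]): boolean {
  return consentState.some((c) => c.id === category && c.consented);
}
```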

Exemptions to consent: Strictly Necessary

Leaving aside technologies necessary for communications, the ICO emphasises that the "strictly necessary" exemption applies when the purpose of the storage or access is essential to provide the service the subscriber or user requests. Helpfully, the ICO Draft Guidance clarifies that technologies used to comply with applicable law (e.g. meeting security requirements) can be regarded as "strictly necessary", such that no consent is required. This will not apply if there are other ways to comply with the relevant legislation without using cookies or similar technologies.

Other examples of activities likely to meet the exemption include: (i) ensuring the security of terminal equipment; (ii) preventing or detecting fraud; (iii) preventing or detecting technical faults; (iv) authenticating the subscriber or user; and (v) recording information or selections made on an online service.

One area of ambiguity remains in relation to fraud prevention and detection. In the financial services sector, websites/apps often use third-party fingerprinting for fraud detection (in order to meet legal obligations to ensure the security of their services). 'Preventing or detecting fraud' is listed as an example of an activity likely to meet the exemption, whilst third-party fingerprinting for fraud prevention is used by the ICO as an example of an activity subject to Regulation 6 of PECR, with the implication that consent is needed (albeit this is not stated). However, the DUA Bill (if passed in its current form) provides some helpful clarity here, as it states that use of such technologies should be regarded as "strictly necessary" where used to protect information, for security purposes, to prevent or detect fraud or technical faults, to facilitate automatic authentication, or to maintain a record of selections made by the user.
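A minimal sketch of how an organisation might classify purposes before deciding whether consent is required is set out below (TypeScript). The purpose list mirrors the examples above; treating fraud prevention as exempt follows the DUA Bill position just described and is an assumption rather than settled law, given the ambiguity in the ICO's own example.

```typescript
// Illustrative only: classifying storage/access purposes before deciding
// whether consent is needed. Treating fraud prevention as exempt follows
// the DUA Bill position discussed above and is an assumption, not settled law.
type StoragePurpose =
  | "security"          // ensuring the security of terminal equipment
  | "fraud-prevention"  // preventing or detecting fraud
  | "fault-detection"   // preventing or detecting technical faults
  | "authentication"    // authenticating the subscriber or user
  | "user-selections"   // recording selections made on the service
  | "analytics"
  | "advertising";

const STRICTLY_NECESSARY = new Set<StoragePurpose>([
  "security",
  "fraud-prevention",
  "fault-detection",
  "authentication",
  "user-selections",
]);

// Consent is only skipped where the purpose falls within the exemption.
function requiresConsent(purpose: StoragePurpose): boolean {
  return !STRICTLY_NECESSARY.has(purpose);
}
```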

Interestingly, the guidance suggests that the use of social media plugins/tools by logged-in users might be strictly necessary, though this does not extend to logged-out users, users who are not a member of that network, or any associated tracking.

Governance and compliance

A number of the ICO's clarifications are likely to impact day-to-day resourcing and operations for any organisation using a material number of storage and access technologies:

  • Governance: the ICO sets out what it expects in respect of governance of storage and access requirements, including an audit checklist, emphasising the need to regularly audit the use of such technologies and to ensure that the rest of the consent ecosystem (including transparency, consent, data sharing, and subsequent processing) is consistent and up to date. This is likely to be resource intensive, and few organisations will be set up for this level of assurance.
  • Transparency: The ICO guidance reinforces the need for transparency around whether any third parties will store/access information on the user's device or receive this information, making clear that all third parties providing cookies or receiving data must be named (avoiding ambiguous references to "partners" or "third parties"), and that specific information must be provided about each, taking into account UK GDPR considerations where personal data is processed. This will be a considerable challenge for complex ecosystems, most notably in the context of online advertising (albeit this has been a known challenge for some time).
  • Consent Ecosystem: The guidance makes very clear that a process must be in place for passing on a user's withdrawal of consent. In practice, the entity collecting the consent is responsible for informing third parties when consent is no longer valid (see the sketch after this list). This is crucial but challenging to comply with, and is again perhaps most relevant in the context of online advertising.
  • Subsequent Processing: as it has done in the past, the ICO continues to strongly suggest that any subsequent processing of personal data obtained via storage/access technologies on the basis of consent should also be based on consent, going as far as to suggest that reliance on an alternative lawful basis (e.g. legitimate interests) may invalidate any initial consent received.
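By way of illustration only, the sketch below (TypeScript) shows one way a consent withdrawal could be passed on to named third parties. The endpoints and payload are hypothetical; in real advertising ecosystems this signalling is typically handled through framework mechanisms (for example an updated consent string) rather than bespoke calls like these.

```typescript
// Illustrative only: passing a consent withdrawal on to the named third
// parties that previously received data. Endpoints and payloads are
// hypothetical; real ecosystems typically rely on framework signals
// (e.g. an updated consent string) rather than bespoke calls like these.
const THIRD_PARTY_CONSENT_ENDPOINTS: Record<string, string> = {
  "ExampleAnalytics Ltd": "https://analytics.example.com/consent",
  "ExampleAds Inc": "https://ads.example.com/consent",
};

async function withdrawConsent(userId: string, category: string): Promise<void> {
  // 1. Stop any further storage/access locally.
  localStorage.removeItem("visitor_id");

  // 2. Notify each named third party that consent for this purpose has ended.
  await Promise.all(
    Object.entries(THIRD_PARTY_CONSENT_ENDPOINTS).map(([name, url]) =>
      fetch(url, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ userId, category, consented: false }),
      }).catch((err) => console.warn(`Could not notify ${name}`, err))
    )
  );
}
```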

Conclusion

As device fingerprinting and other technologies evolve, it is crucial for organisations to stay informed, ensure compliance with the latest guidance, and bear in mind that the regulatory positions in the EU and UK may differ in their nuances.

The ICO's Draft Guidance provides helpful clarity on existing rules in the UK, including detailed examples of how to conduct cookie audits, but does not otherwise provide practical guidance on how to overcome many of the operational privacy challenges faced by controllers (such as monitoring changing users and managing consent withdrawals within online advertising ecosystems).

With increasing regulatory commentary and action in this space, including the ICO's most recent announcement regarding its focus on reviewing cookie usage on the biggest UK sites, now is the time to take stock of your tracking technologies and ensure compliance!

The ICO's Draft Guidance is currently open for consultation, with input sought by 5pm on Friday 14th March 2025. If you have any questions or would like to know more, please get in touch with your usual DLA contact.
