Jules Toynton and Isla Neil | Privacy Matters | DLA Piper Data Protection and Privacy

UK: Google’s U-Turn on Device Fingerprinting: ICO’s Response and Subsequent Guidance
https://privacymatters.dlapiper.com/2025/01/googles-u-turn-on-device-fingerprinting-icos-response-and-subsequent-guidance/
Thu, 30 Jan 2025 18:25:52 +0000

In December 2024, the Information Commissioner’s Office (ICO) responded to Google’s decision to lift its prohibition on device fingerprinting (the practice of collecting and combining information about a device’s software and hardware in order to identify that device) for organisations using its advertising products, effective from 16 February 2025 (see an overview of Google’s new Ads Platforms policies here). This follows Google’s previous decision, in July 2024, to retain third-party cookies.

In its response, the ICO criticised Google’s decision to permit device fingerprinting for advertising purposes as “irresponsible” and emphasised that device fingerprinting:

  1. Requires Consent: device fingerprinting enables a device to be identified even where cookies are blocked or its location is disguised (hence its common use for fraud prevention), but the ICO reinforced that it remains subject to the usual consent requirements.
  2. Reduces User Control: Despite various browsers now offering “enhanced” tracking protection, the ICO stated that device fingerprinting is not a fair means of tracking users online as it diminishes people’s choice and control over how their information is collected.
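To see why the ICO treats fingerprinting as equivalent to cookies, the sketch below (illustrative only; the attribute names are hypothetical, and real fingerprinting scripts combine many more signals such as canvas rendering and installed fonts) shows how software and hardware characteristics can be combined into a persistent identifier that survives cookie deletion:

```python
import hashlib

def fingerprint(device_attributes: dict) -> str:
    """Derive a stable identifier by hashing combined device attributes.

    Because the inputs rarely change, the same device produces the
    same digest on every visit, with nothing stored in a cookie.
    """
    # Sort keys so the same attributes always yield the same digest
    canonical = "|".join(f"{k}={v}" for k, v in sorted(device_attributes.items()))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

device = {
    "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "screen": "1920x1080",
    "timezone": "Europe/London",
    "language": "en-GB",
}
print(fingerprint(device)[:16])  # same device -> same identifier
```

This is why blocking cookies or clearing local storage offers no protection against it, which underpins the ICO’s “reduces user control” concern.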

This statement echoes concerns previously voiced by Google itself, which had stated that device fingerprinting “subverts user choice and is wrong”.

With the potential for fingerprinting to replace the long-debated third-party (3P) cookie functionality, this statement forms part of a shift in regulatory focus to technologies beyond cookies. Various technologies have recently received greater scrutiny, both in the ICO’s Draft Guidance on the use of storage and access technologies (“ICO’s Draft Guidance“) – interestingly issued in December 2024 to coincide with the Google update – and in the European Data Protection Board (EDPB) Guidelines 2/2023 on the Technical Scope of Art. 5(3) of the ePrivacy Directive.

ICO Draft Guidance: Key Takeaways

The ICO’s Draft Guidance explores the practical application of the Privacy and Electronic Communications Regulations (PECR) requirement that consent must be obtained by the user for any storage or access of information on/from a device (‘terminal equipment’), unless such storage/access is strictly necessary for the purposes of a communication or to provide a service requested by the user.

In particular, the Draft Guidance addresses the following areas, each of which is explored in its respective section below:

Technologies

The ICO’s Draft Guidance looks at how and why the rules relating to storage and access of device information apply to various types of technologies used in web browsers, mobile apps or connected devices, namely: cookies; tracking pixels, link decoration and navigational tracking; web storage; scripts and tags; and fingerprinting techniques. The technologies the ICO focuses on overlap to a large extent with the examples used by the EDPB in its guidelines. However, taking the analysis on pixels as an example, the EDPB suggests that any distribution of tracking links/pixels to the user’s device (whether via websites, emails, or text messaging systems) is subject to Article 5(3) of the ePrivacy Directive, as it constitutes ‘storage’ even if only temporarily via client-side caching. The ICO’s guidance is less clear, suggesting that tracking pixels are only subject to Regulation 6 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 (PECR) when they store information on the user’s device. This might imply a less expansive view compared to the EDPB, highlighting the importance of remaining alive to jurisdictional nuances for any global tracking campaigns.
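For context on the pixel analysis above, a tracking pixel is simply an invisible 1x1 image whose URL carries identifying data to the tracker when the client fetches it; the response may then be cached on the device (the client-side ‘storage’ the EDPB points to). A minimal sketch, with an invented endpoint and parameter names:

```python
from urllib.parse import urlencode

def tracking_pixel_tag(base_url: str, params: dict) -> str:
    """Build the HTML for a hypothetical 1x1 tracking pixel.

    The identifying data travels in the query string of the image
    request itself; no visible content is rendered on the page.
    """
    return (f'<img src="{base_url}?{urlencode(params)}" '
            'width="1" height="1" style="display:none">')

tag = tracking_pixel_tag(
    "https://tracker.example.com/px.gif",
    {"campaign": "spring_sale", "user_ref": "abc123"},
)
print(tag)
```

The same pattern works in emails and text messages, which is why the EDPB’s analysis covers all three distribution channels.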

Detailed Consent Requirements

The ICO reiterates that for a PECR consent to be valid, it must meet UK GDPR standards (freely given, specific, informed and unambiguous statement of the individual’s wishes indicated by a clear affirmative action).

    The ICO highlights that, where personal data is processed, consent must be provided by the data subject (in contrast to the PECR user/subscriber consent requirement). This tension is not new, but it remains unclear how the party collecting consent for personal data processed via cookies (or a similar technology) is supposed to know whether the user of a device has changed, without either requiring re-consent or user identification on each visit, or carrying out background identification using fingerprinting or similar techniques – which would itself mean more data processing and may be intrusive.

    In line with recent ICO statements in relation to the lack of ‘reject all’ options, the ICO emphasises that subscribers/users must be able to refuse the use of storage and access technologies as easily as they can consent. Additional points of interest for controllers include:

    • That users must have control over any use of non-essential storage and access technologies. While this could, on a conservative reading, be interpreted as needing US-style granular per-cookie consent, the examples provided suggest high-level consent mechanisms expressed per category (e.g., analytics, social media tracking, marketing) are still acceptable;
    • Clarification that you must specifically name any third parties whose technologies you are requesting consent for (this information can be provided in a layered fashion, provided it is very clear). However, if controls are not required at an individual cookie level, which seems to be the case, this naming becomes less meaningful: data subjects cannot act on the additional information, as they can only reject all storage and access technologies for each purpose category (e.g. all analytics cookies/technologies), rather than those of a particular third party; and
    • Clarification that users must be provided with controls over any use of storage and access technologies for non-essential purposes (albeit this was arguably already required in order to facilitate withdrawal of consent/changing of preferences on an ongoing basis).
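One way to model the expectations in the bullets above – consent captured per purpose category, with the relevant third parties named against each category – is sketched below. The category and vendor names are purely illustrative assumptions, not anything prescribed by the Draft Guidance:

```python
from dataclasses import dataclass, field
import datetime as dt

@dataclass
class ConsentRecord:
    """Per-category consent record naming third parties (illustrative)."""
    timestamp: dt.datetime
    # Consent is captured per purpose category, not per individual cookie
    categories: dict = field(default_factory=lambda: {
        "strictly_necessary": True,   # exempt: no consent required
        "analytics": False,           # default to refused
        "marketing": False,
        "social_media": False,
    })
    # Named third parties per category (can be shown in a layered notice)
    third_parties: dict = field(default_factory=lambda: {
        "analytics": ["ExampleAnalyticsCo"],
        "marketing": ["ExampleAdNetwork", "ExampleDSP"],
    })

    def allows(self, category: str) -> bool:
        return self.categories.get(category, False)

record = ConsentRecord(timestamp=dt.datetime.now(dt.timezone.utc))
record.categories["analytics"] = True  # user opts in to analytics only
print(record.allows("analytics"), record.allows("marketing"))
```

Defaulting every non-essential category to refused, and storing the timestamp, reflects the requirement that refusing must be as easy as consenting and that consent must be demonstrable.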

    Exemptions to consent: Strictly Necessary

    Leaving aside technologies necessary for communications, the ICO emphasises that the “strictly necessary” exemption applies when the purpose of the storage or access is essential to provide the service the subscriber or user requests. Helpfully, the ICO Draft Guidance clarifies that technologies used to comply with applicable law (e.g. meeting security requirements) can be regarded as “strictly necessary”, such that no consent is required. This will not apply if there are other ways to comply with the relevant legislation without using cookies or similar technologies.

    Other examples of activities likely to meet the exemption include: (i) ensuring the security of terminal equipment; (ii) preventing or detecting fraud; (iii) preventing or detecting technical faults; (iv) authenticating the subscriber or user; and (v) recording information or selections made on an online service.

    One area of ambiguity remains in relation to fraud prevention and detection. In the financial services sector, websites/apps often use third-party fingerprinting for fraud detection (in order to meet legal obligations to ensure the security of their services). ‘Preventing or detecting fraud’ is listed as an example of an activity likely to meet the exemption, whilst third-party fingerprinting for fraud prevention is used by the ICO as an example of an activity subject to Regulation 6 PECR, with the implication that consent is needed (albeit this is not stated). However, the DUA Bill (if passed in its current form) provides some helpful clarity here, as it states that use of such technologies should be regarded as “strictly necessary” where used to protect information, for security purposes, to prevent or detect fraud or technical faults, to facilitate automatic authentication, or to maintain a record of selections made by the user.

    Interestingly, the guidance suggests that the use of social media plugins/tools by logged-in users might be strictly necessary, though this does not extend to logged-out users, users who are not a member of that network, or any associated tracking.

    Governance and compliance

    A number of the ICO’s clarifications are likely to impact day to day resourcing and operations for any organisation using material numbers of storage and access technologies:

    • Governance: the ICO sets out what it expects in respect of governance of storage and access requirements, including an audit checklist, and emphasises the need to regularly audit the use of such technologies and to ensure that the rest of the consent ecosystem (including transparency, consent, data sharing, and subsequent processing) is consistent and up to date. This is likely to be resource intensive, and few organisations will be set up for this level of assurance.
    • Transparency:  The ICO guidance reinforces the need for transparency around whether any third parties will store/access information on the user’s device or receive this information, making clear that all third parties providing cookies or receiving data must be named (avoiding ambiguous references to “partners” or “third parties.”), and that specific information must be provided about each, taking into account UK GDPR considerations where personal data is processed. This will be a considerable challenge for complex ecosystems, most notably in the context of online advertising (albeit this has been a known challenge for some time).
    • Consent Ecosystem: The guidance makes very clear that a process must be in place for passing on when a user withdraws their consent. In practice, the entity collecting the consent is responsible for informing third parties when consent is no longer valid. This is crucial but challenging to comply with, and is again perhaps most relevant in the context of online advertising. 
    • Subsequent Processing: as it has done in the past, the ICO strongly suggests that any subsequent processing of personal data obtained via storage/access technologies on the basis of consent should also be based on consent, going as far as to suggest that reliance on an alternative lawful basis (e.g. legitimate interests) may invalidate any initial consent received.
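The consent-ecosystem duty described above – the consent-collecting entity informing third parties when consent is withdrawn – can be sketched as a simple fan-out to each named vendor. The vendor names and the notification mechanism are assumptions; in practice each vendor will accept a different signal (an opt-out API call, a header on the next request, etc.):

```python
def propagate_withdrawal(user_id: str, category: str,
                         third_parties: dict, notify) -> list:
    """Notify every named third party in a category of a withdrawal.

    'notify' stands in for whatever channel each vendor actually
    accepts; this sketch only shows the fan-out responsibility that
    sits with the entity that collected the consent.
    """
    notified = []
    for vendor in third_parties.get(category, []):
        notify(vendor, user_id, category)  # e.g. call the vendor's opt-out endpoint
        notified.append(vendor)
    return notified

log = []
vendors = {"marketing": ["ExampleAdNetwork", "ExampleDSP"]}
propagate_withdrawal("user-42", "marketing", vendors,
                     notify=lambda v, u, c: log.append((v, u, c)))
print(log)
```

Keeping a record of which vendors were notified (the returned list) also supports the audit trail the governance section calls for.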

    Conclusion

    As device fingerprinting and other technologies evolve, it is crucial for organisations to stay informed, ensure compliance with the latest guidance, and bear in mind that there may be nuances between EU and UK regulation.

    The ICO’s Draft Guidance provides helpful clarity on existing rules in the UK, including detailed examples of how to conduct cookie audits, but does not otherwise provide practical guidance on how to overcome many of the operational privacy challenges faced by controllers (such as monitoring changing users and managing consent withdrawals within online advertising ecosystems).

    With increasing regulatory commentary and action in this space, including the ICO’s most recent announcement regarding its focus on reviewing cookie usage on the biggest UK sites, now is the time to take stock of your tracking technologies and ensure compliance!

    The ICO’s Draft Guidance is currently open for consultation, with input sought by 5pm on Friday 14th March 2025. If you have any questions or would like to know more, please get in touch with your usual DLA contact.

    UK: Enforcement Against the Use of Biometrics in the Workplace
    https://privacymatters.dlapiper.com/2024/02/uk-enforcement-against-the-use-of-biometrics-in-the-workplace/
    Thu, 29 Feb 2024 09:29:48 +0000

    The ICO has issued an enforcement notice which provides valuable insights into its approach to the use of biometrics in the workplace, and to the lawfulness of employee monitoring activities more broadly.

    On 23 February 2024, the Information Commissioner’s Office (“ICO”) ordered Serco Leisure Operating Limited (“Serco”), an operator of leisure facilities, to stop using facial recognition technology and fingerprint scanning to process biometric data for monitoring employee attendance and calculating pay for employees’ time. Serco operates the leisure facilities on behalf of leisure trusts, some of which were also issued enforcement notices as joint controllers.

    Background

    Serco introduced biometric technology in May 2017 across 38 Serco-operated leisure facilities. Serco considered that previous systems for monitoring attendance were open to abuse: manual sign-in sheets were prone to human error and were misused by a minority of employees, and ID cards were also used inappropriately. As a result, Serco considered that biometric technology was the best way to prevent these abuses.

    To support this assessment, Serco produced a data protection impact assessment (“DPIA”) and legitimate interest assessment (“LIA”). Within these documents, Serco identified the lawful bases for the processing of biometric data as Articles 6(1)(b) and (f) and the relevant condition for special category personal data as Article 9(2)(b) of the UK General Data Protection Regulation (“UK GDPR”).

    Article 6(1)(b) was selected on the basis that Serco considered that operating the attendance monitoring system was necessary for compliance with the employees’ employment contracts. Article 6(1)(f) was selected in connection with Serco’s legitimate interests, which presumably related to the wider aims of the attendance monitoring system and the move to use biometric data, outlined above.

    Serco selected Article 9(2)(b) on the basis that it considered that this processing was required for compliance with applicable laws relating to employment, social security and social protection. In particular, Serco considered that it needed to process attendance data to comply with a number of regulations, such as working time regulations, national living wage, right to work and tax/accounting regulations.

    The contravention

    Despite the above, the ICO believed Serco, as a controller, had failed to establish an appropriate lawful basis and special category personal data processing condition for the processing of biometric data. Serco had therefore contravened Articles 5(1)(a), 6 and 9 of the UK GDPR. The ICO had previously served Serco with a Preliminary Enforcement Notice in November 2023, giving Serco the opportunity to provide written representations, which the ICO considered in issuing the Enforcement Notice of 23 February 2024.

    The ICO gave Serco three months from the date of the Enforcement Notice to:

    • Cease all processing of biometric data for the purpose of employment attendance checks from the facilities, and not implement biometric technology at any further facilities; and
    • Destroy all biometric data and all other personal and special category data that Serco is not legally obliged to retain.

    Key takeaways from the Enforcement Notice

    1. Processing must be necessary in order to rely on most lawful bases and special category personal data processing conditions.

    The ICO emphasised that the processing of biometric data cannot be considered as “necessary” when less intrusive means could be used to achieve the same purpose.

    It is not ordinarily necessary for an employer to process biometric data in order to operate an attendance monitoring system. It is of course necessary for employee attendance data to be processed, but this would not usually extend to biometric data.

    It could perhaps be possible to argue that it is necessary to use biometric data in connection with attendance monitoring in an extreme case, but this would need to be based on specific circumstances. In this case, although Serco had considered that other less intrusive methods were subject to abuse, this consideration was not sufficient to justify use of biometric data on its own.

    The ICO’s position was that Serco had not provided enough information to support its argument that eliminating abuse of the attendance monitoring system was a necessity, rather than simply a further benefit to Serco. There was a lack of evidence of consideration of alternative means of handling such abuse e.g. taking disciplinary action against the individuals responsible. The processing of biometric data was therefore not a targeted and proportionate way of achieving the purpose of verifying attendance.

    2. An appropriate balancing test must be conducted when relying on legitimate interests.

    The ICO considered that in relying on its legitimate interests as a lawful basis, Serco did not give appropriate weight to the intrusive nature of biometric processing and the risks to the employees. Failure to give such appropriate weight meant that Serco could not rely on Article 6(1)(f).

    Additionally, the ICO found that legitimate interests would not be regarded as an appropriate lawful basis where:

    • The processing has a substantial privacy impact. In this instance, the regular and systematic processing of employee biometric data entailed a regular intrusion into employees’ privacy over which they had no, or minimal, control.
    • Employees are not given clear information about how they can object, or about alternative methods of monitoring that do not involve intrusive processing. The fairness of processing, the availability and ease of exercising data subject rights, and the provision of clear information are all factors to take into account when relying on legitimate interests and conducting the balancing test. The ICO highlighted that Serco had failed to process data fairly by not bringing the alternative mechanisms to employees’ attention, even when an employee complained, and by not informing employees of how they could object to the processing.
    • There is an imbalance of power between the employer and employees, such that employees may not have felt able to object (without detriment) even if they had been informed that they could.

    3. A specific legal obligation must be identified from the outset of processing in order to rely on Article 9(2)(b) UK GDPR.

    In this instance, Serco had initially failed (including in its DPIA), to identify the specific obligation or right conferred by law on which it relied in reference to Article 9(2)(b) of the UK GDPR.

    In this case, it may be that this omission was due to the fact that there is no such obligation or right conferred by law. Whilst there are legal obligations to record time and attendance data, health and safety obligations and requirements to manage the employment relationship, there are no specific legal obligations that would necessitate the processing of biometric data in connection with attendance monitoring.

    In cases where there is a specific legal obligation or right conferred to process special category data (for example, in respect of the employer’s duty to make reasonable adjustments or to manage sickness at work), the ICO emphasised that it is not sufficient to simply select Article 9(2)(b) of the UK GDPR as the basis for processing. The controller must identify the specific obligation or right conferred by law and must have done so from the outset – before the processing of special category personal data commences.

    It is also worth noting that, despite having conducted a DPIA and LIA, Serco could also not rely on this condition because Serco did not produce an appropriate policy document as required by Sch. 1 Para 1(1)(b) of the Data Protection Act 2018 (“DPA”) and had failed to demonstrate the necessity of processing biometric data (as referred to above).

    4. The ICO will take account of infringement amplifiers.

    In addition to biometric data being a category of data that carries a greater risk of harm, the length of time the processing continued without an appropriate lawful basis (since 2017) and the number of data subjects involved (2,283) were also factors that the ICO considered as increasing the seriousness of the infringement.

    Summary and conclusion

    This decision does allow for the possibility to argue that use of biometric data is necessary, targeted and proportionate for attendance monitoring. However, as mentioned above, this would very much depend on the circumstances and the decision shows that this is likely to be the exception rather than the rule.

    If an employer sought to rely on its legal obligations as a lawful basis for the processing, the controller would need to be in a position to show that the processing was now necessary to comply with these requirements. This would require it to provide evidence of widespread abuse and failure of other less intrusive methods. However even in these circumstances the employer would still need to consider fairness and proportionality in the operation of the system, as explained in this post.

    It is possible for an employer to consider using employee consent as a basis under Article 9(2)(a) for processing biometric data in an attendance management system, given the limitations of Article 9(2)(b). However, as noted above, the imbalance of power in the employment relationship will act against the employer in relying on this basis unless there is a genuine ability for the employee to refuse using the system. In such a case, the operation of an alternative option to biometric data will be critical.

    If an employer did wish to adopt biometric data processing for attendance monitoring systems, following this decision, we recommend that such an employer includes the following steps in the context of undertaking its DPIA, LIA and implementation processes:

    • Identify the appropriate lawful basis for the processing activity.
    • If the lawful basis relates to a specific obligation or right conferred by law, identify and document that law.
    • Consider whether the processing could be said to be necessary for the identified lawful basis and gather supporting evidence for this assessment, where relevant.
    • Provide employees with clear information regarding the processing, including information regarding data retention and use, as well as clear information regarding their right to object. This must be provided in advance of the system being implemented.
    • Undertake a full consideration of the fairness and proportionality of the processing, acknowledging that processing biometric data is extremely intrusive and carries significant privacy impacts for employees.
    • Provide employees an alternative option to participate in the attendance monitoring system should they object to the use of their biometric data and ensure that this is used in practice (meaning that there must always be another way to monitor attendance alongside the biometric data).
    • Ensure that an appropriate policy document is implemented, if relying on a lawful basis under the UK GDPR that mandates this (e.g. Article 9(2)(b)).