Using Biometrics in Strong Customer Authentication under PSD2, in line with GDPR

1        Introduction

Authentication means “verifying the identity of a user, process, or device, often as a prerequisite to allowing access to a system’s resources.”[1] Two-factor authentication (2FA) or multi-factor authentication (MFA) is a well-known security practice in online environments that requires from the user more than one distinct authentication factor, such as something the user knows, something the user has, and something the user is.[2] The new Payment Services Directive (PSD2)[3] further enhanced this security practice with “strong customer authentication”, which is defined as follows:

An authentication based on the use of two or more elements categorised as knowledge (something only the user knows), possession (something only the user possesses) and inherence (something the user is) that are independent, in that the breach of one does not compromise the reliability of the others, and is designed in such a way as to protect the confidentiality of the authentication data.[4]

According to PSD2, payment service providers (PSPs) should apply strong customer authentication where the payer:

(a) accesses its payment account online;

(b) initiates an electronic payment transaction;

(c) carries out any action through a remote channel which may imply a risk of payment fraud or other abuses.[5]

Strong customer authentication is a very important obligation for PSPs, as non-compliance causes the PSPs to bear any financial losses stemming from the absence of this security practice.[6] PSPs are also required to have in place adequate security measures to protect the confidentiality and integrity of payment service users’ personalised security credentials.[7]

Article 98 of the PSD2 delegated to the European Banking Authority (EBA) the duty of preparing the draft regulatory technical standards specifying: (a) the requirements of strong customer authentication; (b) the exemptions from the application of strong customer authentication; and (c) the requirements with which security measures have to comply in order to protect the confidentiality and the integrity of the payment service users’ personalised security credentials.[8] The Regulatory Technical Standards (RTS) were finalised and published in the Official Journal of the European Union in March 2018 and will apply as of 14 September 2019 to all PSPs falling under the scope of PSD2,[9] which means that after this date all PSPs have to apply strong customer authentication according to the provisions of this standard.

This essay is not going to discuss all aspects of strong customer authentication; instead, it will focus on one particular aspect: the usage of the inherence element among the authentication factors, in other words, the application of biometrics in strong customer authentication. The incentive for this focus arises from the fact that the usage of biometric data in strong customer authentication might also fall under the scope of the GDPR,[10] as biometric data is defined as personal data[11] under this regulation and, if it is processed for uniquely identifying a natural person, is also categorised as “sensitive personal data”.[12] Being governed by the GDPR, PSPs are exposed to substantial administrative fines in case of non-compliance;[13] therefore, using biometrics in strong customer authentication in a privacy-preserving way is critical. On that account, by discussing the concept of biometrics and biometric data from legal and technical perspectives and by analysing the GDPR’s position on the processing of biometric data in the following sections, this essay aims to identify the GDPR’s requirements for this processing and to put forward an optimum solution for using biometrics in strong customer authentication in a GDPR- and PSD2-compliant way.

2        Biometrics and Biometric Data in Technical and Legal Respects

The reference text on the harmonisation of biometric vocabulary[14] is the ISO/IEC 2382-37 document.[15] It aims “to provide a systematic description of the concepts in the subject field of biometrics and to clarify the use of the terms in this subject field.”[16] However, the terms defined in this document are intermingled and complicated, and in order to clarify the technical grounds and see the whole picture some of the basic terms should be explained.

According to this document, biometrics is defined as “automated recognition of individuals based on their biological and behavioural characteristics”.[17]  Biometrics is also indicated as the synonym of biometric recognition[18] which is stated to encompass biometric identification[19] and biometric verification[20].

Biometric data, on the other hand, is defined distinctively from biometrics as: “biometric sample or aggregation of biometric samples at any stage of processing, e.g. biometric reference, biometric probe, biometric feature or biometric property”.[21]

The relationship between these terms can be summarised in the following figure:

Figure 1: The relationship between basic ISO/IEC 2382-37 terminology

In the legal context, it could be said that there was not much awareness of biometrics or biometric data in data protection law until the early 2000s. Neither Convention 108[22] nor the 95/46 Directive (DPD)[23] featured these terms. After the usage of biometric technologies became prevalent at the beginning of the 2000s, the Article 29 Working Party (WP29)[24] published a working document on biometrics[25] in 2003 and later brought out Opinion 3/2012 on developments in biometric technologies[26] in 2012. Both documents stated that biometric data is personal data, but neither of them gave any formal definition of it. In fact, the term biometric data was first defined by WP29 in its Opinion 4/2007,[27] which was related to the general concept of personal data and not specifically focused on biometrics or biometric data.[28]

Furthermore, WP29 used the term biometrics as a synonym of biometric data,[29] which caused terminological confusion and showed that data protection law had not recognised the exact terminology of ISO/IEC 2382-37 at the time.[30] As stated above, biometrics should refer to the “automated recognition” which encompasses biometric identification and biometric verification. The difference between these two terms is also explained in the WP29 working document as follows:

The difference between authentication (verification) and identification is important. Authentication answers to the question: Am I the one I pretend to be? The system certifies the identity of the person by processing biometric data which refer to the person who asks and takes a yes/no decision (1:1 comparison). Identification answers to the question: Who am I? The system recognises the individual who asks by distinguishing him from other persons, whose biometric data is also stored. In that case the system takes a 1-of-n decision, and answers that the person who asks is X.[31]
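The 1:1 versus 1-of-n decision described above can be sketched in code. The following is a toy illustration only: the `similarity` function and the threshold are hypothetical stand-ins, not any real biometric matching algorithm.

```python
# Illustrative sketch of biometric verification (1:1) vs identification (1:n).
# "Templates" here are plain feature vectors; the similarity measure is a toy
# negated squared distance (an assumption for illustration, not a real matcher).

def similarity(probe, template):
    """Toy similarity score: higher means more alike."""
    return -sum((p - t) ** 2 for p, t in zip(probe, template))

THRESHOLD = -0.05  # acceptance threshold (illustrative)

def verify(probe, claimed_template):
    """Verification: 'Am I the one I pretend to be?' -- a single 1:1 comparison."""
    return similarity(probe, claimed_template) >= THRESHOLD

def identify(probe, enrolled):
    """Identification: 'Who am I?' -- a 1-of-n decision over all stored references."""
    best_id, best_score = None, float("-inf")
    for user_id, template in enrolled.items():
        score = similarity(probe, template)
        if score > best_score:
            best_id, best_score = user_id, score
    return best_id if best_score >= THRESHOLD else None

enrolled = {"alice": [0.1, 0.9], "bob": [0.8, 0.2]}
probe = [0.12, 0.88]
print(verify(probe, enrolled["alice"]))  # 1:1 decision -> True
print(identify(probe, enrolled))         # 1:n decision -> "alice"
```

The privacy-relevant difference is visible in the signatures: verification compares against a single claimed reference, while identification must search a whole database of stored biometric references.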

3        Processing of Biometric Data Under GDPR

When it comes to the GDPR, Article 4(14) defines biometric data as “personal data resulting from a specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data[32]”.[33] Apart from being related to the physical, physiological or behavioural characteristics of a natural person, this definition of biometric data includes three different important aspects that are worth discussing.

First of all, biometric data is not defined as “any kind of data”. It is explicitly specified under this definition as being “personal data”. As the definition of personal data under Article 4(1) suggests, if biometric data is a kind of personal data, then it should relate to an identified or identifiable natural person.

Secondly, this personal data (biometric data) should result from specific technical processing. While the GDPR does not provide any further explanation of what should be understood by this specific technical processing, it can be deduced from the definition that this technical processing should allow or confirm the unique identification of an individual.[34] Jasserand explained this technical processing in three steps: Step 1 – capturing and collection of biometric characteristics (e.g. a fingerprint sample); Step 2 – extraction of features and encoding of them in the form of a biometric template; Step 3 – comparison of the presented biometric characteristic with a previously recorded template or sample.[35]
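Jasserand's three steps can be sketched as a minimal pipeline. Everything below is a toy stand-in for illustration: the "captured sample" is a dummy vector and the "feature extraction" is a simple normalisation, not a real biometric algorithm.

```python
# A minimal sketch of the three-step "specific technical processing":
# capture -> feature extraction / template encoding -> comparison.

def capture_sample():
    """Step 1: capture the biometric characteristic (here a dummy raw sample)."""
    return [10, 20, 30, 40]

def extract_template(sample):
    """Step 2: extract features and encode them as a biometric template
    (toy mean-centring stands in for real feature extraction)."""
    mean = sum(sample) / len(sample)
    return [round(x - mean, 6) for x in sample]

def compare(probe_template, reference_template, tolerance=1e-6):
    """Step 3: compare the presented characteristic against the stored reference."""
    return all(abs(p - r) <= tolerance
               for p, r in zip(probe_template, reference_template))

reference = extract_template(capture_sample())   # enrolment
probe = extract_template(capture_sample())       # later presentation
print(compare(probe, reference))                 # -> True
```

Only the output of step 2, the encoded template, needs to be stored for later comparison; this is the stage at which the raw characteristic becomes "biometric data" in the Article 4(14) sense.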

Lastly, the biometric data concerned should allow or confirm the unique identification of the natural person. The identity “allowance” and “confirmation” here imply biometric identification and biometric verification respectively. This can be deduced from Recital 51 of the GDPR, as it states that: “The processing of photographs should not systematically be considered to be processing of special categories of personal data as they are covered by the definition of biometric data only when processed through a specific technical means allowing the unique identification or authentication of a natural person.” (underline added) Since the definition of biometric data uses the phrase “which allow or confirm the unique identification”, it could be said that the “authentication” term used in Recital 51 specifies the “confirming of the unique identification”. That is to say, biometric identification and verification are present in the definition of biometric data through the verbs “allowing” and “confirming”.[36] It should also be noted that, although the terms identity verification and authentication are often used interchangeably,[37] the ISO standard opposes this usage.[38]

Another novelty introduced by the GDPR for biometric data is that its processing is now subject to a different regime if it is processed for uniquely identifying a natural person. This new regime is called the processing of special categories of personal data, formerly known as the “processing of sensitive data”.[39] Article 9(1) of the GDPR addresses this as follows:

Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation shall be prohibited. (underline added)

As a result of this provision, the processing of biometric data for the purpose of uniquely identifying a natural person is by default forbidden, like the processing of all other special categories of personal data. However, Articles 9(2)(a) and 9(2)(g) state that this rule shall not apply if the data subject has given explicit consent to the processing of those personal data for one or more specified purposes, or if processing is necessary for reasons of substantial public interest on the basis of Union or Member State law.

It should also be noted that according to Article 9(1), only biometric data that is processed for the purpose of uniquely identifying a natural person is deemed sensitive. That is to say, while biometric data itself is by default defined as personal data under Article 4(14) of the GDPR, it becomes sensitive data only when it is used for identification, not verification. Jasserand argues that:

Biometric data resulting from both biometric identification (establishment of the identity) and identity verification should qualify as sensitive data, provided they relate to an identified individual. Still, a doubt persists because of the ambiguous wording of Recital 51 GDPR. If ‘allowing the unique identification’ refers to the biometric identification function and ‘allowing the authentication’ means ‘identity verification’, biometric data used for identity verification (such as passport/ID verification) would be excluded from the scope of sensitive data.[40]

However, Kindt seems more convinced in her argument, as she states that “the processing for verification purposes, both technically different but also generally different from identification, may hence in our opinion not fall under this general prohibition”.[41] Although it can be said that the Article 4(14) definition and the Recital 51 wording lack precision, Kindt’s stance seems more reasonable, as WP29[42] clearly made distinct definitions for identification and authentication, and Recital 51 uses both terms separately. Therefore, “unique identification” or the “purpose of uniquely identifying” should not include the verification process.

Another principle introduced by the GDPR for the processing of biometric data is the requirement of a data protection impact assessment (DPIA). Article 35(3)(b) requires that if there is “processing on a large scale of special categories of data referred to in Article 9(1)”,[43] then a DPIA should be conducted. Since biometric data processed for uniquely identifying a natural person is mentioned as sensitive data under Article 9(1), this kind of processing of biometric data requires a DPIA. Furthermore, Article 35(1) of the GDPR requires that “where a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out” a DPIA. Kindt argued that even where biometric data does not fall under the scope of Article 9(1) of the GDPR (when it is not used for identification purposes), a DPIA is required by virtue of Article 35(1), as biometric technology is a new technology.[44]

Moreover, Article 9(4) of the GDPR allows the Member States to maintain or introduce further conditions, including limitations, with regard to the processing of biometric data. That is to say, data controllers who want to process biometric data should also pay attention to the national data protection laws of the Member States whose citizens are the data subjects of this processing.

According to the information given in this section, it could be argued that the processing of biometric data is an onerous process and requires the data controller to pay attention to many important points if it does not want to be subject to substantial penalties resulting from non-compliance with the GDPR. As Kindt suggested,[45] there might be an easier regime for some particular kinds of biometric data processing, such as using it only for verification purposes, or where the biometric data does not leave, and is not transmitted outside of, a personal device that is controlled and possessed by the data subject. On that account, a privacy-preserving biometric online authentication solution that PSPs might use will be discussed in the next section.

4         A Privacy by Design Strong Authentication Solution

Biometrics not only enables enhanced security by adding an additional authentication factor in strong authentication, but also provides convenience and a better user experience, as users need not memorise secrets when this authentication factor is used. However, as mentioned in the previous sections, using biometrics and processing biometric data may be an onerous process in terms of data protection regulations, and the consequences of non-compliance with these regulations can be very serious.

Therefore, data controllers who want to apply strong customer authentication featuring a biometric authentication factor should minimise the risk of non-compliance with data protection regulations as far as possible. One of the best methods to do this is to adopt a “privacy by design” approach, which is also addressed as an obligation in Article 25 of the GDPR.

One possible and effective way of applying this approach in strong customer authentication under PSD2 is the Fast Identity Online Alliance (FIDO)[46] authentication architecture. The FIDO architecture essentially utilises multi-factor cryptographic software or hardware[47] containing an asymmetric private cryptographic key that requires activation through a second authentication factor such as a PIN or a biometric characteristic. In this way, authentication is accomplished by proving possession and control of the private key and the multi-factor cryptographic software or hardware authenticator, which are in turn activated by either something you know or something you are.[48] This usage resembles using a smart card (something you have) for digital signatures and activating the private key embodied in the smart card with a PIN (something you know) to sign the transaction. But rather than using a smart card, the FIDO architecture uses software such as a mobile-banking application or hardware such as a smartphone, which can likewise enable the activation of the private key embodied in it by a PIN (something you know) or by a biometric factor (something you are). That is to say, while biometrics are not required in FIDO implementations, FIDO solutions can use biometrics as one authentication factor.[49]

As stated above, the very core of FIDO authentication relies on asymmetric public key cryptography, where the private key is kept secret and stored on the user’s device and only the public key is shared with the online service. This means no credential secrets are ever shared with servers, and the likelihood of credential theft and the related data breach risk is lowered substantially.[50]

Another strength of the FIDO architecture is that it is designed in such a way that biometric data can never leave or be exported off the user’s device. Biometric comparison against a database of biometric references is prohibited.[51] This is the most important feature of the FIDO architecture enabling the privacy-by-design approach. The user’s biometric characteristics are verified by performing an offline local match against the biometric template previously stored on the user’s device. A successful biometric verification enables the activation of the private key stored on the device and the signing of an authentication challenge, which is then sent online to the service provider to complete the authentication.[52]
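The flow described above (a local match gating a private key, which signs a server challenge) can be sketched as follows. This is a minimal illustration, not the FIDO protocol itself: the biometric match is a toy equality check, and Ed25519 from the widely used Python `cryptography` package stands in for whatever signature scheme a real authenticator uses.

```python
# Sketch of a FIDO-style flow: the template and private key never leave the
# simulated device; only the public key is shared with the online service.
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

class Authenticator:
    """Simulated on-device authenticator."""
    def __init__(self, enrolled_template):
        self._template = enrolled_template                 # stays on the device
        self._private_key = Ed25519PrivateKey.generate()   # stays on the device

    def public_key(self):
        """The only key material shared with the service at registration."""
        return self._private_key.public_key()

    def sign_challenge(self, presented_biometric, challenge: bytes):
        # Local 1:1 match gates access to the private key (toy equality check).
        if presented_biometric != self._template:
            raise PermissionError("biometric verification failed; key not activated")
        return self._private_key.sign(challenge)

# Registration: the service stores only the public key.
device = Authenticator(enrolled_template="toy-template")
server_public_key = device.public_key()

# Authentication: the server sends a fresh challenge, the device signs it.
challenge = os.urandom(32)
signature = device.sign_challenge("toy-template", challenge)
server_public_key.verify(signature, challenge)  # raises InvalidSignature if bad
print("authenticated")
```

Note that the server only ever sees the public key, the challenge, and the signature, never the biometric template or the comparison result beyond success or failure, which is what makes the arrangement privacy-preserving.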

Moreover, since the biometric data is processed only for verification, not for unique identification, this processing of personal data is not subject to the sensitive data regime addressed under Article 9 of the GDPR. The data controller will still be governed by the other controller obligations defined under the GDPR, as the biometric data being processed is still deemed personal data (but not sensitive personal data), which indeed makes the data controller’s life much easier in terms of GDPR compliance.

The French Data Protection Authority (CNIL) even took this approach to a whole new level by stating that the processing of biometric data may fall under the purely personal or household activity exemption and may not be governed by the GDPR if the following conditions are met:

  • The biometric template is stored in a fully compartmentalised secure environment within a device that is under the sole control and private use of the individual user;
  • That secure environment should prevent the biometric data itself from being accessible from outside, and the biometric template should never leave the device;
  • The biometric template stored in the device should be in encrypted form, using a cryptographic algorithm and key management in accordance with the state of the art;
  • The service or application that uses this authentication mode should receive only information on the success or failure of the comparison between the presented biometric characteristic and the template (which means it only conducts verification-based comparison);
  • The user must have chosen to use biometric authentication although the service or application provider has offered an alternative authentication mode to biometrics, such as entering a code (which means that if biometric authentication is the only option for the user, this exemption does not apply).[53]

As can be understood from this approach, CNIL is describing precisely the FIDO way of using biometrics without explicitly referring to it. One particularly important aspect of CNIL’s decision, however, is that it indicates other national data protection authorities may soon propose similar additional rules for the usage of biometrics in authentication.

5        Conclusion

Strong customer authentication (SCA) is one of the obligations that PSD2 imposes on payment service providers (PSPs). PSPs who want to use biometrics as an authentication factor for SCA should also pay attention to other regulatory obligations originating from the GDPR, as it deems biometric data personal data when used for the unique identification or verification of a natural person.

Additionally, the GDPR treats biometric data as “sensitive data” when it is processed for uniquely identifying an individual. When treated as sensitive data, biometric data imposes additional obligations on data controllers, such as a data protection impact assessment (DPIA), and makes them subject to heavier fines in case of non-compliance. However, it should also be noted that the GDPR’s definition of biometric data, and the condition addressed therein for when biometric data becomes “sensitive data”, lack precision and may need further clarification.

Nevertheless, the strict obligations of the GDPR should not inhibit the use of technologies like biometrics, which indeed enhance security and at the same time facilitate the user experience when utilised as an online authentication factor. One solution might be the FIDO authentication method, as it both respects privacy and utilises biometrics. It seems likely that this method will be widely used in the near future for strong authentication encompassing biometrics, as countries like France have started encouraging it by granting exemptions from the obligations of the GDPR when such a method is used.

This is an obvious consequence of adopting a privacy by design approach. Cavoukian points this out as a basic privacy by design principle: there should always be a positive sum, not a zero sum, when talking about privacy.[54] That is to say, privacy should not be seen as an obstacle when security or new technologies are concerned, because privacy by design aims to make them possible at the same time.

BIBLIOGRAPHY

‘Article 29 Data Protection Working Party, Opinion 3/2012 on Developments in Biometric Technologies (00720/12/EN WP193 2012)’

‘Article 29 Data Protection Working Party, Opinion 4/2007 on the Concept of Personal Data (WP136 2007)’

‘Article 29 Data Protection Working Party, Working Document on Biometrics (12168/02/EN WP 80 2003)’

‘Biometrics in Personal Smartphones: Application of the Data Protection Framework | CNIL’ <https://www.cnil.fr/fr/biometrie-dans-les-smartphones-des-particuliers-application-du-cadre-de-protection-des-donnees> accessed 20 February 2019

Cavoukian A, ‘Privacy by Design – The 7 Foundational Principles’ <https://iapp.org/resources/article/privacy-by-design-the-7-foundational-principles/> accessed 3 March 2019

‘FIDO Alliance – Open Authentication Standards More Secure than Passwords’ (FIDO Alliance) <https://fidoalliance.org/> accessed 3 March 2019

‘FIDO Authentication And The General Data Protection Regulation’ <https://fidoalliance.org/wp-content/uploads/FIDO_Authentication_and_GDPR_White_Paper_May2018-1.pdf>

Grassi PA and others, ‘Digital Identity Guidelines: Authentication and Lifecycle Management’ (National Institute of Standards and Technology 2017) NIST SP 800-63b <https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-63b.pdf> accessed 20 February 2019

Grassi PA, Garcia ME and Fenton JL, ‘Digital Identity Guidelines: Revision 3’ (National Institute of Standards and Technology 2017) NIST SP 800-63-3 <https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-63-3.pdf> accessed 20 February 2019

‘International Standard, Information Technology Vocabulary, Part 37:Biometrics (ISO/IEC 2382-37, Second Edition, ISO/IEC 2017)’

Jasserand C, ‘Legal Nature of Biometric Data: From Generic Personal Data to Sensitive Data’ (2016) 2 European Data Protection Law Review (EDPL) 297

Jasserand CA, ‘Avoiding Terminological Confusion between the Notions of “Biometrics” and “Biometric Data”: An Investigation into the Meanings of the Terms from a European Data Protection and a Scientific Perspective’ (2016) 6 International Data Privacy Law 63

Kindt EJ, ‘Having Yes, Using No? About the New Legal Regime for Biometric Data’ (2018) 34 Computer Law & Security Review 523

Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, ETS, No 108 1981

Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data 1995 [31995L0046]

Commission Delegated Regulation (EU) 2018/389 of 27 November 2017 supplementing Directive (EU) 2015/2366 of the European Parliament and of the Council with regard to regulatory technical standards for strong customer authentication and common and secure open standards of communication (Text with EEA relevance. ) 2018 (OJ L)

Council Decision 2008/615/JHA of 23 June 2008 on the stepping up of cross-border cooperation, particularly in combating terrorism and cross-border crime 2008 (OJ L)

Directive (EU) 2015/2366 of the European Parliament and of the Council of 25 November 2015 on payment services in the internal market, amending Directives 2002/65/EC, 2009/110/EC and 2013/36/EU and Regulation (EU) No 1093/2010, and repealing Directive 2007/64/EC (Text with EEA relevance) 2015 (337)

Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (Text with EEA relevance) 2016 (OJ L)


[1] Paul A Grassi, Michael E Garcia and James L Fenton, ‘Digital Identity Guidelines: Revision 3’ (National Institute of Standards and Technology 2017) NIST SP 800-63-3 41 <https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-63-3.pdf> accessed 20 February 2019.

[2] ibid 49.

[3] Directive (EU) 2015/2366 of the European Parliament and of the Council of 25 November 2015 on payment services in the internal market, amending Directives 2002/65/EC, 2009/110/EC and 2013/36/EU and Regulation (EU) No 1093/2010, and repealing Directive 2007/64/EC (Text with EEA relevance) 2015 (337).

[4] ibid Article 4(30).

[5] ibid Article 97(1).

[6] ibid Article 74(2), 92(1).

[7] ibid Article 97(3).

[8] ibid Article 98.

[9] Commission Delegated Regulation (EU) 2018/389 of 27 November 2017 supplementing Directive (EU) 2015/2366 of the European Parliament and of the Council with regard to regulatory technical standards for strong customer authentication and common and secure open standards of communication (Text with EEA relevance. ) 2018 (OJ L) s Article 38(2).

[10] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (Text with EEA relevance) 2016 (OJ L).

[11] ibid Article 4(14).

[12] ibid Article 9.

[13] ibid Article 83.

[14] Catherine A Jasserand, ‘Avoiding Terminological Confusion between the Notions of “Biometrics” and “Biometric Data”: An Investigation into the Meanings of the Terms from a European Data Protection and a Scientific Perspective’ (2016) 6 International Data Privacy Law 63, 68.

[15] ‘International Standard, Information Technology Vocabulary, Part 37:Biometrics (ISO/IEC 2382-37, Second Edition, ISO/IEC 2017)’.

[16] ibid V.

[17] ibid 3.1.3.

[18] ibid.

[19] ibid 3.8.2.

[20] ibid 3.8.3.

[21] ibid 3.3.6.

[22] Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, ETS, No 108 1981.

[23] Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data 1995 [31995L0046].

[24] ibid Article 29-30.

[25] ‘Article 29 Data Protection Working Party, Working Document on Biometrics (12168/02/EN WP 80 2003)’.

[26] ‘Article 29 Data Protection Working Party, Opinion 3/2012 on Developments in Biometric Technologies (00720/12/EN WP193 2012)’.

[27] ‘Article 29 Data Protection Working Party, Opinion 4/2007 on the Concept of Personal Data (WP136 2007)’.

[28] Jasserand (n 14) 67.

[29] ‘Article 29 Data Protection Working Party, Working Document on Biometrics (12168/02/EN WP 80 2003)’ (n 25) 2; ‘Article 29 Data Protection Working Party, Opinion 3/2012 on Developments in Biometric Technologies (00720/12/EN WP193 2012)’ (n 26) 6, 11, 14.

[30] Jasserand (n 14) 70.

[31] ‘Article 29 Data Protection Working Party, Working Document on Biometrics (12168/02/EN WP 80 2003)’ (n 25) 3, footnote 4.

[32] ‘Dactyloscopic data mean fingerprint images, images of fingerprint latents, palm prints, palm print latents and templates of such images (coded minutiae), when they are stored and dealt with in an automated database.’ Council Decision 2008/615/JHA of 23 June 2008 on the stepping up of cross-border cooperation, particularly in combating terrorism and cross-border crime 2008 (OJ L) s Article 2(i).

[33] GDPR s Article 4(14).

[34] Catherine Jasserand, ‘Legal Nature of Biometric Data: From Generic Personal Data to Sensitive Data’ (2016) 2 European Data Protection Law Review (EDPL) 297, 302.

[35] ibid 302–303.

[36] ibid 305.

[37] ‘Biometric systems are applications of biometric technologies, which allow the automatic identification, and/or authentication/verification of a person’ ‘Article 29 Data Protection Working Party, Working Document on Biometrics (12168/02/EN WP 80 2003)’ (n 25) 3; Section 2 Definitions – ‘Biometric verification/authentication’ ‘Article 29 Data Protection Working Party, Opinion 3/2012 on Developments in Biometric Technologies (00720/12/EN WP193 2012)’ (n 26) 6.

[38] Note 1 to entry: Use of the term ‘authentication’ as a substitute for biometric verification is deprecated. ‘International Standard, Information Technology Vocabulary, Part 37:Biometrics (ISO/IEC 2382-37, Second Edition, ISO/IEC 2017)’ (n 15) s 3.8.3.

[39] Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data (n 23) s Article 8; GDPR s Recital 10.

[40] Jasserand (n 34) 309.

[41] EJ Kindt, ‘Having Yes, Using No? About the New Legal Regime for Biometric Data’ (2018) 34 Computer Law & Security Review 523, 526–527.

[42] ‘Article 29 Data Protection Working Party, Working Document on Biometrics (12168/02/EN WP 80 2003)’ (n 25) 3, footnote 4.

[43] GDPR s Article 35(3)(b).

[44] Kindt (n 41) 535.

[45] ibid.

[46] ‘FIDO Alliance – Open Authentication Standards More Secure than Passwords’ (FIDO Alliance) <https://fidoalliance.org/> accessed 3 March 2019.

[47] Paul A Grassi and others, ‘Digital Identity Guidelines: Authentication and Lifecycle Management’ (National Institute of Standards and Technology 2017) NIST SP 800-63b s 5.1.8 and 5.1.9 <https://nvlpubs.nist.gov/nistpubs/SpecialPublications/NIST.SP.800-63b.pdf> accessed 20 February 2019.

[48] ibid.

[49] ‘FIDO Authentication And The General Data Protection Regulation’ 3 <https://fidoalliance.org/wp-content/uploads/FIDO_Authentication_and_GDPR_White_Paper_May2018-1.pdf>.

[50] ibid 8.

[51] ibid 8–9.

[52] ibid 10.

[53] ‘Biometrics in Personal Smartphones: Application of the Data Protection Framework | CNIL’ <https://www.cnil.fr/fr/biometrie-dans-les-smartphones-des-particuliers-application-du-cadre-de-protection-des-donnees> accessed 20 February 2019.

[54] Ann Cavoukian, ‘Privacy by Design – The 7 Foundational Principles’ <https://iapp.org/resources/article/privacy-by-design-the-7-foundational-principles/> accessed 3 March 2019.