‘Look at the camera and say cheese’: the existing European legal framework for facial recognition technology in criminal investigations

Pages 1-20 | Published online: 25 Jul 2023
 

ABSTRACT

Facial recognition technology (FRT) represents the state of the art in modern criminal investigations. However, the legal position regarding its use in Europe remains unclear. This ambiguity does not stem from a total lack of applicable norms. Rather, it arises because the existing standards form a legal patchwork distributed among European Union primary and secondary laws and norms from the Council of Europe. Although the recent European Commission proposal on a legal framework for artificial intelligence has clarified some issues, it has also added complexity to an already unclear domain. Before launching an effort to build a specific set of norms for the use of FRT in criminal investigations, it is crucial to survey the norms already in place in order to avoid contradictions and overlaps.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes

1 The paper uses the concept of facial recognition as defined by the Article 29 Working Party (WP29): ‘automatic processing of digital images which contain the faces of individuals for identification, authentication/verification or categorisation of those individuals’ (WP29, ‘Opinion 02/2012 on Facial Recognition in Online and Mobile Services’, 00727/12/EN WP 192 (22 March 2012) 2 <https://www.pdpjournals.com/docs/87997.pdf> accessed 4 May 2023).

2 Court of Justice of the European Union, ‘Video-Surveillance Policy’ (2015) <https://curia.europa.eu/jcms/upload/docs/application/pdf/2016-07/videosurveillance_policy_juin2015_with_annexes.pdf> accessed 13 April 2023.

3 This relates to the much higher level of threat to human rights posed by the use of FRT; see D Dushi, ‘The Use of Facial Recognition Technology in EU Law Enforcement: Fundamental Rights Implications’ (Global Campus South East Europe, 2020) 4 <https://repository.gchumanrights.org/handle/20.500.11825/1625> accessed 10 May 2023.

4 Cf. VL Raposo, ‘(Do Not) Remember My Face: Uses of Facial Recognition Technology in Light of the General Data Protection Regulation’ (2022) 32(1) Information & Communications Technology Law 45, doi:10.1080/13600834.2022.2054076.

5 This paper will not provide suggestions for a future legal framework for the use of FRT in law enforcement. For this see VL Raposo, ‘The Use of Facial Recognition Technology by Law Enforcement in Europe: A Non-Orwellian Draft Proposal’ (2022) Eur J Crim Pol Res. 1, doi:10.1007/s10610-022-09512-y.

6 Dushi (n 3) 3; EDRi, ‘Facial Recognition and Fundamental Rights 101’ (European Digital Rights, 2019) <https://edri.org/facial-recognition-and-fundamental-rights-101> accessed 14 May 2023; European Union Agency for Fundamental Rights, ‘Facial Recognition Technology: Fundamental Rights Considerations in the Context of Law Enforcement’ (21 November 2019) 7–8 <https://fra.europa.eu/en/publication/2019/facial-recognition-technology-fundamental-rights-considerations-context-law> accessed 4 May 2023; J Brennan, ‘Facial Recognition: Defining Terms to Clarify Challenges’ (Ada Lovelace Institute, 2019) <https://www.adalovelaceinstitute.org/blog/facial-recognition-defining-terms-to-clarify-challenges/> accessed 3 March 2023; Raposo (n 4).

7 Article 29 Data Protection Working Party, ‘Opinion on “Developments in Biometric Technologies”’ (2012) 6 <https://ec.europa.eu/justice/article-29/documentation/opinion-recommendation/files/2012/wp193_en.pdf> accessed 3 May 2023.

8 Article 19, ‘Emotional Entanglement: China’s Emotion Recognition Market and its Implications for Human Rights’ (2021) <https://www.article19.org/wp-content/uploads/2021/01/ER-Tech-China-Report.pdf> accessed 23 May 2023.

9 Cf. Raposo (n 5); European Data Protection Board, ‘Guidelines 05/2022 on the Use of Facial Recognition Technology in the Area of Law Enforcement’ (2022) 7 ff <https://edpb.europa.eu/system/files/2022-05/edpb-guidelines_202205_frtlawenforcement_en_1.pdf> accessed 20 May 2023.

10 This is very common in airports when using biometric passports; see Council Regulation (EC) No 2252/2004 of 13 December 2004 on Standards for Security Features and Biometrics in Passports and Travel Documents Issued By Member States <https://eur-lex.europa.eu/legal-content/EN/ALL/?uri=celex%3A32004R2252> accessed 3 May 2023.

11 J Sanchez del Rio and others, ‘Automated Border Control E-Gates and Facial Recognition Systems’ (2016) 62 Computers & Security 49, doi.org/10.1016/j.cose.2016.07.001.

12 Commission Nationale de l'Informatique et des Libertés (CNIL), ‘Reconnaissance Faciale – Pour un Débat à la Hauteur des Enjeux’ (2019) 3 <https://www.cnil.fr/fr/reconnaissance-faciale-pour-un-debat-la-hauteur-des-enjeux> accessed 14 March 2023; B Leong, ‘Facial Recognition and the Future of Privacy: I Always Feel Like … Somebody’s Watching Me’ (2019) 75(3) Bulletin of the Atomic Scientists 109, 110, doi:10.1080/00963402.2019.1604886.

13 M O’Flaherty, ‘Opinions, Facial Recognition Technology and Fundamental Rights’ (2020) 6(2) European Data Protection Law Review 170, 172, doi.org/10.21552/edpl/2020/2/4.

14 Cf. Commission Nationale de l'Informatique et des Libertés (n 12) 3.

15 M Mann and M Smith (‘Automated Facial Recognition Technology: Recent Developments and Approaches to Oversight’ (2017) 40(1) UNSW Law Journal 121, 123) point out the advantages of these databases: they are not invasive, and identification can be done at distance and without the suspect’s knowledge.

16 R Watts, ‘Facial Recognition as a Force for Good’ (2019) Biometric Technology Today, 5. The author gives the example of a pilot programme using FRT in New Delhi, India, which identified 3,000 missing children on the city’s streets within its first four days of use.

17 Such as international meetings of world leaders, music concerts and sporting events. For instance, London police used it at the Notting Hill Carnival in 2018 (Mayor of London, ‘Notting Hill Carnival and Automated Facial Recognition’ (2018) <https://www.london.gov.uk/questions/2018/1492> accessed 1 May 2023).

18 VL Raposo, ‘Can China’s ‘Standard of Care’ for COVID-19 Be Replicated in Europe?’ (2020) 46 Journal of Medical Ethics 451, doi.org/10.1136/medethics-2020-106210.

19 Also used at European borders to control migration. See European Union Agency for Fundamental Rights (n 6) 13–17.

20 Despite its growth around the world, there are also counter-movements: the largest distributor of police body cameras, Axon, announced that it will not use FRT owing to its lack of reliability (S Ingber, ‘Major Police Body Camera Manufacturer Rejects Facial Recognition Software’ (NPR, 2019) <https://www.npr.org/2019/06/27/736644485/major-police-body-camera-manufacturer-rejects-facial-recognition-software> accessed 11 April 2023).

21 E Sánchez Nicolás, ‘EU Warned over Fast-Tracking Facial Recognition’ (EU Observer, 2019) <https://euobserver.com/science/146732> accessed 2 February 2023.

22 The UK data protection authority, the ICO, recommended that police forces ‘slow down’ the use of these technologies but did not ban them (E Denham, ‘Blog: Live Facial Recognition Technology – Police Forces Need to Slow Down and Justify its Use’ (ICO, 2019) <https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2019/10/live-facial-recognition-technology-police-forces-need-to-slow-down-and-justify-its-use/> accessed 13 May 2023).

23 NewEurope, ‘Sweden Authorises the Use of Facial Recognition Technology by the Police’ (2019) <https://www.neweurope.eu/article/sweden-authorises-the-use-of-facial-recognition-technology-by-the-police/> accessed 2 April 2023.

24 H Samsel, ‘California Becomes Third State to Ban Facial Recognition Software in Police Body Cameras’ (Security Today, 2019) <https://securitytoday.com/articles/2019/10/10/california-to-become-third-state-to-ban-facial-recognition-software-in-police-body-cameras.aspx> accessed 23 March 2023.

25 M Andrejevic and N Selwyn, ‘Facial Recognition Technology and the End of Privacy for Good’ (Monash Lens, 2020), <https://lens.monash.edu/@politics-society/2020/01/23/1379547/facial-recognition-tech-and-the-end-of-privacy>, accessed 20 May 2023; C Haskins, R Mac and A Pequeño IV, ‘Police In At Least 24 Countries Have Used Clearview AI. Find Out Which Ones Here’ (2021) <https://www.buzzfeednews.com/article/ryanmac/clearview-ai-international-search-table> accessed 12 February 2023.

One instance of contestation involved the Swedish police, which unlawfully processed biometric data for facial recognition and were consequently fined by the Swedish data protection authority. Cf. European Data Protection Board, ‘Swedish DPA: Police Unlawfully Used Facial Recognition App’ (2021) <https://edpb.europa.eu/news/national-news/2021/swedish-dpa-police-unlawfully-used-facial-recognition-app_en> accessed 3 May 2023.

26 European Data Protection Board, ‘Thirty-First Plenary Session: Establishment of a Taskforce on TikTok. Response to MEPs on use of Clearview AI by Law Enforcement Authorities, Response to ENISA Advisory Group, Response to Open Letter NYOB’ (2020) <https://edpb.europa.eu/news/news/2020/thirty-first-plenary-session-establishment-taskforce-tiktok-response-meps-use_en> accessed 11 May 2023.

27 European Data Protection Board, ‘Facial Recognition: Italian SA Fines Clearview AI EUR 20 Million’ (2022) <https://edpb.europa.eu/news/national-news/2022/facial-recognition-italian-sa-fines-clearview-ai-eur-20-million_en> accessed 12 March 2023.

The British ICO also issued a £7,552,800 fine (ICO, ‘ICO Fines Facial Recognition Database Company Clearview AI Inc More than £7.5m and Orders UK Data to Be Deleted’ (2022) <https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2022/05/ico-fines-facial-recognition-database-company-clearview-ai-inc> accessed 3 April 2023).

28 Interestingly, not all facial recognition is performed using AI. Apart from FRT, police forces also use so-called super-recognisers (<https://superrecognisersinternational.com/> accessed 2 March 2023), i.e., people who are particularly good at remembering faces and recognising them in crowds and who account for one to two percent of the population.

29 Décret n° 2012-652 du 4 mai 2012 relatif au traitement d'antécédents judiciaires <https://www.legifrance.gouv.fr/jorf/id/JORFTEXT000025803463> accessed 23 March 2023.

30 Assemblée Nationale, Rapport D’information déposé en application de l’article 145 du Règlement par la commission des lois constitutionnelles, de la législation et de l’administration générale de la république (2018) <https://www.assemblee-nationale.fr/dyn/15/rapports/cion_lois/l15b1335_rapport-information#P739_166902> accessed 1 May 2023.

31 CC Garnier, ‘Dans Tous les Commissariats de France, on Utilise la Reconnaissance Faciale’ (Street Press, 2021) <https://www.streetpress.com/sujet/1617723420-tous-commissariats-france-utilise-reconnaissance-faciale-police-gendarmerie-justice-surveillance-zad-squat-libertes-societe> accessed 12 February 2023.

For instance, the use of FRT enabled the location (and subsequent shooting) of Khamzat Azimov, who in 2018 stabbed five people in an Islamist terrorist attack (C Simon and L Colcombet, ‘Attaque au Couteau à Paris: Azimov Identifié Grâce à la Reconnaissance Faciale’ (Le Parisien, 2018) <https://www.leparisien.fr/faits-divers/reconnaissance-faciale-comment-a-ete-identifie-l-assaillant-khamzat-azimov-14-05-2018-7715343.php> accessed 12 March 2023).

32 Decreto legislativo 18 maggio 2018, n. 51, recante l’Attuazione della direttiva (UE) 2016/680, <https://www.gazzettaufficiale.it/eli/id/2018/05/24/18G00080/sg>, accessed 12 March 2023.

33 Garante per la Protezione dei Dati Personali, Parere sul Sistema Sari Real Time (2021) <https://www.garanteprivacy.it/web/guest/home/docweb/-/docweb-display/docweb/9575877> accessed 12 March 2023.

34 Deutscher Bundestag, Drucksache 20/895 (2022) <https://dserver.bundestag.de/btd/20/008/2000895.pdf> accessed 2 January 2023.

See also a previous answer from 2021 in Deutscher Bundestag, Drucksache 19/29651 (14 May 2021) <https://dserver.bundestag.de/btd/19/296/1929651.pdf#page=42> accessed 23 January 2023.

36 Police forces in some jurisdictions exhibit a particularly high propensity to use these technologies; for example, police forces in Hong Kong have used FRT to identify pro-democracy activists.

37 European Union Agency for Fundamental Rights (n 6) 12.

38 On the concept of biometric data, EJ Kindt, Privacy and Data Protection Issues of Biometric Applications (Springer 2013) 15–272. Biometric facial data should not be confused with photographs that have not been processed through specific technical means (see Recital 51 of the GDPR and Recital 29 of Regulation (EU) 2018/1725 of the European Parliament and of the Council of 23 October 2018 on the protection of natural persons with regard to the processing of personal data by the Union institutions, bodies, offices and agencies and on the free movement of such data <https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A32018R1725> accessed 12 April 2023).

39 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) <https://eur-lex.europa.eu/eli/reg/2016/679/oj> accessed 2 April 2023.

On the protection of private data, see also the Data Protection ‘Convention 108’ and the Additional Protocol to the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, regarding supervisory authorities and transborder data flows. For a comment on the application of the rights therein proclaimed to FRT, see Consultative Committee of the Convention for the Protection of Individuals with Regard to Automatic Processing of Personal Data (Convention 108), ‘Guidelines on Facial Recognition’ (2021) 15–16 <https://rm.coe.int/guidelines-on-facial-recognition/1680a134f3> accessed 2 April 2023.

40 Suggesting this hypothesis, EDRi, ‘Ban Biometric Mass Surveillance (A Set Of Fundamental Rights Demands for the European Commission and EU Member States)’ (2020) 25 <https://edri.org/our-work/blog-ban-biometric-mass-surveillance/>, accessed 3 May 2023.

41 Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA <https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A02016L0680-20160504> accessed 2 April 2023.

42 L Drechsler, ‘Wanted: LED Adequacy Decisions. How the Absence of any LED Adequacy Decision Is Hurting the Protection of Fundamental Rights in a Law Enforcement Context’ (2021) International Data Privacy Law ipaa019, doi.org/10.1093/idpl/ipaa019.

43 P Vogiatzoglou and others, ‘From Theory to Practice: Exercising the Right of Access Under the Law Enforcement and PNR Directives’ (2021) 11 JIPITEC 274, 280.

44 This hypothesis is suggested in EDRi (n 40) 24.

On the rules about biometric data processing under the GDPR see Raposo (n 5).

45 As pointed out by EDRi (n 40) 25.

46 From a different perspective, it is possible to distinguish first-generation (photos, fingerprints, DNA samples) from second-generation biometrics (facial recognition software, remote iris recognition, voice pattern analysis). Cf. Scottish Government, ‘Code of Practice on The Acquisition, Use, Retention and Disposal of Biometric Data for Justice and Community Safety Purposes in Scotland (Draft for Public Consultation)’ (2018) 5 <https://www.gov.scot/binaries/content/documents/govscot/publications/consultation-paper/2018/07/consultation-enhanced-oversight-biometric-data-justice-community-safety-purposes/documents/00538315-pdf/00538315-pdf/govscot%3Adocument/00538315.pdf> accessed 1 April 2023.

47 On this norm, European Data Protection Board (n 9) 17–19.

48 For the particular case of its use in criminal investigations, see Article 10(1) of the GDPR.

49 See the interpretation of those rules in IN Rezende, ‘Facial Recognition in Police Hands: Assessing the ‘Clearview Case’ from a European Perspective’ (2020) 11(3) New Journal of European Criminal Law 375, 382–388.

50 Charter of Fundamental Rights of the European Union, <https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:12012P/TXT>, accessed 3 April 2023.

51 Although not specifically referring to the LED, see Case C-623/17 Privacy International v Secretary of State for Foreign and Commonwealth Affairs and Others [2020] ECLI:EU:C:2020:790, par 68.

52 Although not specifically referring to the LED, see Case C-311/18 Data Protection Commissioner v Facebook Ireland Ltd and Maximillian Schrems [2020] ECLI:EU:C:2020:559, pars 175 and 180 (Schrems II).

53 European Data Protection Board (n 9) 18.

54 About the rights of the data subjects, ibid 21–24.

55 Vogiatzoglou and others (n 43) 286.

56 Ibid 286–287.

57 Ibid 288.

58 Ibid 288.

59 Ibid 288.

60 Article 29 Data Protection Working Party, ‘Opinion on Some Key Issues of the Law Enforcement Directive’, EU 2016/680 (2017) 8 <https://ec.europa.eu/newsroom/article29/items/610178> accessed 13 May 2023.

61 Mann and Smith (n 15) 124; B Stegner, ‘Facebook Photo Tagging Guide: Everything You Need to Know’, MUO (2020) <https://www.makeuseof.com/tag/3-things-you-need-to-know-about-photo-tagging-in-facebook/> accessed 20 April 2023.

62 European Data Protection Board (n 9) 19.

63 ‘Function creep’ is a concept generally employed to refer to the use of a technology for a purpose other than that for which it was created. About this concept see B-J Koops, ‘The Concept of Function Creep’ (2021) 13(1) Law, Innovation and Technology 29.

64 Article 29 Data Protection Working Party (n 60) 9.

65 Even if consent is not required, the data subject still has the right to receive information from the data controller, as provided for in Article 13 of the LED.

66 A similar deduction in EDRi (n 40) 24.

67 IJIS Institute and International Association of Chiefs of Police (IACP), ‘Law Enforcement – Facial Recognition Use Case Catalogue’ (2019) 1 <https://www.theiacp.org/resources/document/law-enforcement-facial-recognition-use-case-catalog> accessed 22 April 2023.

68 European Union Agency for Fundamental Rights (n 6) 26.

69 According to the WP29 (Article 29 Data Protection Working Party, ‘Opinion 06/2014 on the Notion of Legitimate Interests of the Data Controller Under Article 7 of Directive 95/46/EC’ (2017) 8), this requirement obliges law enforcement agencies to invoke ‘precise and particularly solid justifications for the processing of such data’.

70 Although not specifically referring to the LED, see Cases C-293/12 and C-594/12 Digital Rights Ireland and Others [2014] ECLI:EU:C:2014:238, pars 54–55.

71 Y Bathaee, ‘The Artificial Intelligence Black Box and the Failure of Intent and Causation’ (2018) 31(2) Harvard Journal of Law & Technology 890; B Dickson, ‘The Dangers of Trusting Black-Box Machine Learning’ TechTalks (2020) <https://bdtechtalks.com/2020/07/27/black-box-ai-models/> accessed 12 May 2023 (nonetheless, some argue that this is not the case: C Rudin and J Radin, ‘Why Are We Using Black Box Models in AI When We Don’t Need To? A Lesson from an Explainable AI Competition’ (2019) 1(2) Harvard Data Science Review, doi.org/10.1162/99608f92.5a8a3a3d).

72 JM Meyers, ‘Artificial Intelligence and Trade Secrets’, American Bar Association (2019) <https://www.americanbar.org/groups/intellectual_property_law/publications/landslide/2018-19/january-february/artificial-intelligence-trade-secrets-webinar/> accessed 16 March 2023.

73 Referring to the much higher level of threat to human rights, Dushi (n 3) 4.

74 V Toom, R Granja and A Ludwig, ‘The Prüm Decisions as an Aspirational Regime: Reviewing a Decade of Cross-Border Exchange and Comparison of Forensic DNA Data’ (2019) 41 Forensic Science International Genetics 50, doi.org/10.1016/j.fsigen.2019.03.023.

75 Proposal for a Regulation of the European Parliament and of the Council on automated data exchange for police cooperation (‘Prüm II’), amending Council Decisions 2008/615/JHA and 2008/616/JHA and Regulations (EU) 2018/1726, 2019/817 and 2019/818 of the European Parliament and of the Council, COM/2021/784 final <https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=COM%3A2021%3A784%3AFIN&qid=1639141496518> accessed 11 March 2023.

See also European Parliament, ‘Report on the proposal for a regulation of the European Parliament and of the Council on automated data exchange for police cooperation (“Prüm II”), amending Council Decisions 2008/615/JHA and 2008/616/JHA and Regulations (EU) 2018/1726, 2019/817 and 2019/818 of the European Parliament and of the Council’, 26.05.2023, <https://www.europarl.europa.eu/doceo/document/A-9-2023-0200_EN.html>, accessed 17 July 2023.

Cf. EDRi, ‘EDRi Challenges Expansion of Police Surveillance Via Prüm’ (2021) <https://edri.org/our-work/edri-challenges-expansion-of-police-surveillance-via-prum/> accessed 24 February 2023.

76 In detail see European Data Protection Board, ‘Guidelines 2/2020 on Articles 46 (2) (a) and 46 (3) (b) of Regulation 2016/679 for Transfers of Personal Data Between EEA and non-EEA Public Authorities and Bodies’ (2020) <https://edpb.europa.eu/sites/default/files/files/file1/edpb_guidelines_202002_art46guidelines_internationaltransferspublicbodies_v2_en.pdf> accessed 12 May 2023.

77 Dushi (n 3) 8.

78 Treaty on the Functioning of the European Union <https://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:12012E/TXT:en:PDF> accessed 3 April 2023.

79 Declaration on the Protection of Personal Data in the Fields of Judicial Cooperation in Criminal Matters and Police Cooperation, OJ 2012 C 326/337 (2012) <https://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:C:2012:326:FULL:EN:PDF> accessed 28 April 2023.

80 Some blurred examples are provided in Drechsler (n 42) 3.

81 See L Drechsler, ‘The Achilles Heel of EU Data Protection in a Law Enforcement Context: International Transfers Under Appropriate Safeguards in the Law Enforcement Directive’, Cybercrime: New Threats, New Responses (Proceedings of the XV International Conference on Internet, Law & Politics. Universitat Oberta de Catalunya, Barcelona, 1–2 July 2020), (Huygens Editorial, 2020) 51–54 <https://ssrn.com/abstract=3664125> accessed 3 April 2023.

82 Derogations from these requirements can be found in Arts 35(1)(c) and 39(1) of the LED.

83 See the demand for an ‘adequate level of protection’ in both the Schrems I (Case C-362/14 Maximillian Schrems v Data Protection Commissioner [2015] ECLI:EU:C:2015:650) and Schrems II (n 52) decisions in light of the GDPR.

84 About adequacy decisions under the LED, Drechsler (n 81) 52–53.

85 About adequate safeguards under the LED, Drechsler (n 81) 53–57.

86 Schrems II (n 52) par. 92–96.

87 This criticism is made in Drechsler (n 42) 8.

Note that the Schrems decision does not directly affect the LED, since the latter pertains to the processing of personal data by law enforcement authorities for the purpose of preventing, investigating, detecting and prosecuting criminal offences. However, the decision has implications for law enforcement authorities as well, as they are also required to ensure that any personal data transferred outside the EU is adequately protected and compliant with EU data protection laws, including the GDPR and the LED.

88 Drechsler (n 42) 8.

A detailed analysis of data transfers for law enforcement purposes between the EU and the US can be found in T Christakis and F Terpan, ‘EU – US Negotiations on Law Enforcement Access to Data: Divergences, Challenges and EU Law Procedures and Options’ (2021) 11(2) International Data Privacy Law 81, doi.org/10.1093/idpl/ipaa022.

89 Agreement between the United States of America and the European Union on the protection of personal information relating to the prevention, investigation, detection, and prosecution of criminal offences, OJ 2016 L 336/3 (Umbrella Agreement) <https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:22016A1210(01)&rid=3> accessed 11 May 2023.

90 Some of these advantages are set out in Drechsler (n 42) 10.

The author notes that adequacy decisions are also relevant for international data transfers carried out by Europol, Eurojust and the European Public Prosecutor’s Office, whose existing arrangements might not pass the scrutiny of the CJEU (see Drechsler (n 42) 11–12).

91 European Commission, Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, Brussels, 19.2.2020 COM(2020) 66 final (2020) <https://ec.europa.eu/info/sites/default/files/communication-european-strategy-data-19feb2020_en.pdf> accessed 25 February 2023.

92 Convention for the Protection of Human Rights and Fundamental Freedoms <https://www.echr.coe.int/documents/d/echr/Convention_ENG> accessed 2 April 2023.

93 This is a particular concern, considering the role that privacy plays in European law (B Petkova, ‘Privacy as Europe's First Amendment’ (2019) 25 Eur Law J. 140, doi.org/10.1111/eulj.12316).

94 Cf. ‘The Limits of State Intervention: Personal Identity and Ultra-Risky Actions’ (1976) 85(6) The Yale Law Journal 826.

95 M Hirose, ‘Privacy in Public Spaces: The Reasonable Expectation of Privacy Against the Dragnet Use of Facial Recognition Technology’ (2016) 49 Conn. L. Rev. 1591.

96 Katz v United States, 389 U.S. 347 (1967).

97 This ground led to the identification in US case law of various scenarios in which there is no expectation of privacy: a ‘person travelling in an automobile on public thoroughfares’ (United States v. Knotts, 460 U.S. 276, 281–82, 1983), and ‘physical characteristics … constantly exposed to the public’ such as the voice or the face (United States v. Dionisio, 410 U.S. 1, 1973). Cf. Congressional Research Service, ‘Facial Recognition Technology and Law Enforcement: Select Constitutional Considerations’ (2020) 12–16 <https://crsreports.congress.gov/product/pdf/R/R46541> accessed 23 March 2023.

98 Bărbulescu v Romania (App no 61496/08) ECHR [GC] 5 September 2017 par.73; Benedik v Slovenia (App no 62357/14) ECHR 24 April 2018 par. 101.

99 S. v United Kingdom (Marper) (App no 30562/04 and 30566/04) ECHR [GC] 4 December 2008.

100 Ibid par. 125.

UK courts have also ruled on the retention of biometric data based on the norms of the ECHR. In R (on the application of Wood) v Metropolitan Police Commissioner [2009] 4 All ER 951, the court analysed the case in light of Article 8 of the ECHR and the principle of proportionality, with a particular focus on the fact that the plaintiff had not committed any crime. There is a comment on this case in Mann and Smith (n 15) 134–135.

101 M Madianou, ‘The Biometric Assemblage: Surveillance, Experimentation, Profit, and the Measuring of Refugee Bodies’ (2019) 20(6) Television & New Media.

102 V Mitsilegas and others, ‘Data Retention and the Future of Large-Scale Surveillance: The Evolution and Contestation of Judicial Benchmarks’ (2022) European Law Journal, doi:10.1111/eulj.12417.

103 NA Softness, ‘Social Media and Intelligence: The Precedent and Future for Regulations’ (2017) 34(1) American Intelligence Journal 32.

104 VL Raposo, ‘You Can Run, But You Can’t Hide: Digital State Surveillance in Liberal Democracies’, The Digital Constitutionalist (2022) <https://digi-con.org/you-can-run-but-you-cant-hide/> accessed 14 February 2023; Y-L Liu, W Yan and B Hu, ‘Resistance to Facial Recognition Payment in China: The Influence of Privacy-Related Factors’ (2021) 45(5) Telecommunications Policy 102155.

105 The peril of an evolution (or rather a devolution) towards the Chinese model is highlighted by I Nesterova, ‘Mass Data Gathering and Surveillance: The Fight Against Facial Recognition Technology in the Globalized World’ (2020) 74 SHS Web Conf, 03006, doi.org/10.1051/shsconf/20207403006.

106 ‘In a sense, liberal democracies were forced to become less liberal to survive as such. This is certainly a compelling reason to accept ‘some’ State surveillance (…) It won’t be the first time that laudable purposes (the preservation of liberal democracies) lead to tragic outcomes (the very extinction of liberal democracies)’. Cf. Raposo (n 104).

107 Such as the EDPB (European Data Protection Board, Letter from the EDPB to the European Parliament, Ref: OUT2020-0052 (2020) <https://edpb.europa.eu/sites/default/files/files/file1/edpb_letter_out_2020-0052_facialrecognition.pdf> accessed 26 April 2023), the European Union Agency for Fundamental Rights (n 6) and EDRi (n 40).

108 About individual privacy and group privacy, RÁ Costello, ‘Genetic Data and the Right to Privacy: Towards a Relational Theory of Privacy?’ (2022) 22(1) Human Rights Law Review 1, 5–9.

109 EDRi (n 6) 8.

110 European Union Agency for Fundamental Rights (n 6) 4.

111 About mistaken identification, VL Raposo, ‘When Facial Recognition Does Not ‘Recognise’ – Erroneous Identifications and Resulting Liabilities’ (2023) AI & Society, doi.org/10.1007/s00146-023-01634-z.

112 European Digital Rights, ‘Facial Recognition and Fundamental Rights’ (2019), <https://edri.org/our-work/facial-recognition-and-fundamental-rights-101/>, accessed 14 January 2023; European Union Agency for Fundamental Rights, ‘#BigData. Discrimination in Data-Supported Decision Making’ (2018) <https://fra.europa.eu/en/publication/2018/bigdata-discrimination-data-supported-decision-making> accessed 4 April 2023.

113 For the particular case of privacy rights see Digital Rights Ireland and Others (n 70).

114 K Lenaerts, ‘Limits on Limitations: The Essence of Fundamental Rights in the EU’ (2019) 20(6) German Law Journal 779.

115 Joined Cases C-511/18, C-512/18 and C-520/18 La Quadrature du Net and Others [2020] ECLI:EU:C:2020:791, par 131.

116 Specifically on privacy rights, see S. v United Kingdom (Marper) (n 99) par 95–104.

117 Even though the United Kingdom (UK) is no longer part of the EU, its laws on data protection closely follow EU norms. It is, therefore, worthwhile to consider the UK’s experience with the regulation of FRT. The British police were the first in Europe to employ FRT, which they used with only moderate success at the UEFA Champions League final in June 2017. Since then, FRT has been used more regularly, though such usage has sometimes been contested. A notable case involved a citizen who challenged in court the use of FRT after being captured by a camera while passing through a public place. Initially, the High Court found that although there was an interference with the right to privacy, as set out in the European Convention on Human Rights (ECHR), the use of the technology was lawful because it was necessary, proportionate and non-discriminatory (R (Bridges) v Chief Constable of the South Wales Police [2019] EWHC 2341 (Admin)). This ruling cannot be seen as a general acceptance of the use of FRT by police forces, as it was largely accepted that the Court’s assessment was restricted to those particular circumstances. On appeal, however, the Court of Appeal decided otherwise, holding that the use of FRT in that context was unlawful and a violation of Article 8 of the ECHR. The Court of Appeal also expressed concern about the discretionary powers of the police, particularly their power to decide who could be placed on a watchlist and when FRT should be deployed (Denham (n 22)). The UK Information Commissioner also intervened in this case and claimed that when FRT ‘involves large-scale and relatively indiscriminate processing of personal data’, it constitutes a ‘serious interference’ with privacy rights (M Burgess, ‘Inside the Urgent Battle to Stop UK Police Using Facial Recognition’, Wired (2019) <https://www.wired.co.uk/article/uk-police-facial-recognition> accessed 19 April 2023). In the aftermath of this decision, the UK Information Commissioner, Elizabeth Denham, suggested drawing up a code of conduct to bind law enforcement agents when using FRT (BBC, ‘Facial Recognition Technology Code of Conduct Call’, BBC News (2019) <https://www.bbc.com/news/uk-wales-50251643> accessed 22 February 2023).

118 O’Flaherty (n 13) 172.

119 Schrems I, Advocate General’s Opinion, 23 September 2015, ECLI:EU:C:2015:627.

120 European Commission, Proposal for a Regulation of the European Parliament and of the Council Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act) and Amending Certain Union Legislative Acts, Brussels, 21.4.2021 COM(2021) 206 final 2021/0106 (COD) <https://digital-strategy.ec.europa.eu/en/library/proposal-regulation-laying-down-harmonised-rules-artificial-intelligence> accessed 3 May 2023.

For a comment on this proposal, see VL Raposo, ‘Ex Machina: Preliminary Critical Assessment of the European Draft Act on Artificial Intelligence’ (2022) 30(1) International Journal of Law and Information Technology 88, doi.org/10.1093/ijlit/eaac007.

121 European Commission, ‘Shaping Europe’s Digital Future’ (2021), <https://digital-strategy.ec.europa.eu/en>, accessed 9 May 2023.

122 European Commission, ‘Structure for A White Paper on Artificial Intelligence, a European Approach’, draft as of 12/12 (2020) <https://www.politico.eu/wp-content/uploads/2020/01/AI-white-paper-CLEAN.pdf> accessed 26 March 2023.

123 ‘It follows that, in accordance with current EU data protection rules and the Charter of Fundamental Rights, AI can only be used for remote biometric identification purposes where such use is duly justified, proportionate and subject to adequate safeguards’ (European Commission, ‘White Paper On Artificial Intelligence – A European Approach to Excellence and Trust,’ Brussels, COM(2020) 65 final, (2020) 22 <https://ec.europa.eu/info/sites/info/files/commission-white-paper-artificial-intelligence-feb2020_en.pdf> accessed 2 April 2023).

124 Ibid 21 ff.

125 European Commission (n 120).

126 Art 5(1)(d) of the AIA bans ‘real-time remote biometric identification systems in publicly accessible spaces for the purpose of law enforcement’.

127 European Council, ‘Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts – General approach’, 25 November 2022, <https://artificialintelligenceact.eu/wp-content/uploads/2022/12/AIA-%E2%80%93-CZ-%E2%80%93-General-Approach-25-Nov-22.pdf>, accessed 12 April 2023.

128 European Parliament, ‘Amendments adopted by the European Parliament on 14 June 2023 on the proposal for a regulation of the European Parliament and of the Council on laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts (COM(2021)0206 – C9-0146/2021–2021/0106(COD)’, 14 June 2023, <https://artificialintelligenceact.eu/wp-content/uploads/2023/06/AIA-%E2%80%93-IMCO-LIBE-Draft-Compromise-Amendments-14-June-2023.pdf>, accessed 9 July 2023.

129 One of the changes relates to a newly added prohibition of ‘the placing on the market, putting into service or use of AI systems to infer emotions of a natural person in the areas of law enforcement, border management, in workplace and education institutions’ (Article 5(1)(dc) of the AIA, Parliament’s version), which might also be carried out through FRT. However, because such a use does not involve the identification of the individual, it falls outside the scope of this paper.

130 Article 5(1)(d) AIA, Parliament’s version: ‘the use of “real-time” remote biometric identification systems in publicly accessible spaces’.

131 Article 5(1)(dd) of the AIA, Parliament’s version:

the putting into service or use of AI systems for the analysis of recorded footage of publicly accessible spaces through ‘post’ remote biometric identification systems, unless they are subject to a pre-judicial authorisation in accordance with Union law and strictly necessary for the targeted search connected to a specific serious criminal offense as defined in Article 83(1) of TFEU that already took place for the purpose of law enforcement.

132 F Reinhold and A Müller, ‘AlgorithmWatch’s Response to the European Commission’s Proposed Regulation on Artificial Intelligence – A Major Step with Major Gaps’ (AlgorithmWatch, 2021) <https://algorithmwatch.org/en/response-to-eu-ai-regulation-proposal-2021/> accessed 10 May 2023; J Scipione, ‘Has the Horse Bolted? Dealing with Legal and Practical Challenges of Facial Recognition’ (MediaLaws 2022) <https://ssrn.com/abstract=4019105> accessed 16 March 2023; Raposo (n 5).

133 Council Framework Decision of 13 June 2002 on the European arrest warrant and the surrender procedures between member states – Statements made by certain member states on the adoption of the Framework Decision, 2002/584/JHA, <https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A32002F0584>, accessed 12 March 2023.

134 However, prior judicial authorisation can be postponed in cases of urgency, a solution that might be prone to abuse (Raposo (n 120) 96).

135 Ibid 95–96.

136 Also, it should be noted that the AIA does not regulate international cooperation in law enforcement, including in the domain of FRT (Article 2(4) of the AIA). Cf. European Data Protection Board – European Data Protection Supervisor, ‘Joint Opinion 5/2021 on the Proposal for a Regulation of the European Parliament and of the Council Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act)’ (2021) 9, <https://edpb.europa.eu/system/files/2021-06/edpb-edps_joint_opinion_ai_regulation_en.pdf>, accessed 6 March 2023.

137 Ibid.

138 The norms on criminal procedure and due process, which are mostly enacted at the national level, must be added to this.

139 Cf. Raposo 2022 (n 5).
