
Increasing transparency around facial recognition technologies in law enforcement: towards a model framework

Pages 66-84 | Published online: 30 Aug 2023

ABSTRACT

Law enforcement authorities around the world are increasingly trialing or using facial recognition technologies (FRT). Their use raises many legal and ethical challenges, one of which is a lack of transparency: community members do not have sufficient information about which government organizations use FRT, for what purposes, and what safeguards are in place to manage the risks that these technologies pose to human rights. While policy makers and academics have demanded increased transparency around FRT use, there are no established guidelines on how much transparency is needed around different FRT applications (authentication, identification, categorization) or on the barriers to achieving the expected levels of transparency. This article fills this gap by proposing criteria that would help determine the required levels of transparency for different FRT applications (both low-risk and high-risk) and by examining organizational, technical, legal and operational barriers to adequate transparency and how they could be addressed.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes

1 Facial recognition technology (FRT): 100 countries analyzed. (2021, June 8). Comparitech. https://www.comparitech.com/blog/vpn-privacy/facial-recognition-statistics/#:~:text=Five%20countries.

2 Facial recognition technology (FRT): 100 countries analyzed. (2021, June 8). Comparitech. https://www.comparitech.com/blog/vpn-privacy/facial-recognition-statistics/#:~:text=Five%20countries%E2%80%93China%2C%20Russia%2C.

3 Australia: Gillespie, E. (2019, March 4). Are you being scanned? How facial recognition technology follows you, even as you shop. The Guardian. https://www.theguardian.com/technology/2019/feb/24/are-you-being-scanned-how-facial-recognition-technology-follows-you-even-as-you-shop; Canada: Office of the Privacy Commissioner of Canada. (2021, June 10). Police use of Facial Recognition Technology in Canada and the way forward. https://www.priv.gc.ca/en/opc-actions-and-decisions/ar_index/202021/sr_rcmp/; Italy introduces a moratorium on video surveillance systems that use facial recognition. (n.d.). European Digital Rights (EDRi). Retrieved February 16, 2022, from https://edri.org/our-work/italy-introduces-a-moratorium-on-video-surveillance-systems-that-use-facial-recognition/; France: Legal action against police facial recognition technology. (n.d.). Statewatch. Retrieved February 16, 2022, from https://www.statewatch.org/news/2020/september/france-legal-action-against-police-facial-recognition-technology/; UK police forces testing new retrospective facial recognition that could identify criminals. (2021, July 31). Inews.co.uk. https://inews.co.uk/news/technology/uk-police-testing-retrospective-facial-recognition-identify-criminals-1128711.

4 Facial Recognition Software Prompts Privacy, Racism Concerns in Cities and States. (n.d.). Pew.org. Retrieved February 16, 2022, from https://www.pewtrusts.org/en/research-and-analysis/blogs/stateline/2019/08/09/facial-recognition-software-prompts-privacy-racism-concerns-in-cities-and-states; C Garvie and J Frankle (2016, April 7). Facial-Recognition Software Might Have a Racial Bias Problem. The Atlantic. https://www.theatlantic.com/technology/archive/2016/04/the-underlying-bias-of-facial-recognition-systems/476991/; J Buolamwini (2019, April 24). Response: Racial and Gender bias in Amazon Rekognition – Commercial AI System for Analyzing Faces. Medium. https://medium.com/@Joy.Buolamwini/response-racial-and-gender-bias-in-amazon-rekognition-commercial-ai-system-for-analyzing-faces-a289222eeced.

5 C Garvie, AM Bedoya, and J Frankle, The Perpetual Line-Up: Unregulated Police Face Recognition in America (2016), Georgetown L. Ctr. on Priv. & Tech. https://www.perpetuallineup.org/sites/default/files/2016-12/The%20Perpetual%20Line-Up%20-%20Center%20on%20Privacy%20and%20Technology%20at%20Georgetown%20Law%20-%20121616.pdf.

6 E.g. Jennifer Lynch, ‘Face Off: Law Enforcement Use of Face Recognition Technology’ (April 20, 2020). Available at SSRN: https://ssrn.com/abstract=3909038 or http://doi.org/10.2139/ssrn.3909038, at 27; B Nober (2020). ‘A Call for Transparency in Law Enforcement Use of Facial Recognition’, Northwestern Undergraduate Research Journal. https://doi.org/10.21985/n2-nzpa-cv38; Christopher Jones, ‘Law Enforcement Use Of Facial Recognition: Bias, Disparate Impacts On People Of Color, And The Need For Federal Legislation’ 22 N.C. J. L. & Tech. 777 (2021).

7 See e.g. NSW Ombudsman, ‘The new machinery of government: using machine technology in administrative decision-making’ (State of New South Wales 29 November 2021) <www.ombo.nsw.gov.au/Find-a-publication/publications/reports/state-and-local-government/the-new-machinery-of-government-using-machine-technology-in-administrative-decision-making> accessed 15 September 2022; European Ombudsman, ‘Report on the meeting between European Ombudsman and European Commission representatives’ (19 November 2021) <www.ombudsman.europa.eu/en/doc/inspection-report/en/149338> accessed 15 September 2022.

8 World Economic Forum (WEF) and others, A Policy Framework for Responsible Limits on Facial Recognition Use Case: Law Enforcement Investigations, Insight Report (Revised), 2022, https://www3.weforum.org/docs/WEF_Facial_Recognition_for_Law_Enforcement_Investigations_2022.pdf.

9 Council of Europe, Guidelines on Facial Recognition, 28 January 2021, T-PD(2020)03rev4, https://rm.coe.int/guidelines-on-facial-recognition/1680a134f3.

10 European Data Protection Board (EDPB), Guidelines 05/2022 on the use of facial recognition technology in the area of law enforcement, version 1, 12 May 2022, https://edpb.europa.eu/system/files/2022-05/edpb-guidelines_202205_frtlawenforcement_en_1.pdf.

11 E.g., Sonia Hickey, Australia: Police accused of lying about use of ineffective facial recognition software, Mondaq, 09 March 2020, https://www.mondaq.com/australia/crime/902056/police-accused-of-lying-about-use-of-ineffective-facial-recognition-software.

12 Hannah Bloch-Wehba, ‘Visible policing: Technology, Transparency, and Democratic Control’, 109 Calif. L. Rev. 917 (2021), 957.

13 Access Now, ‘Snapshot Report: Europe‘s Approach to AI: How AI Strategy is Evolving’, 2020 https://www.accessnow.org/cms/assets/uploads/2020/12/Report-Snapshot-Europes-approach-to-AI-How-AI-strategy-is-evolving-1.pdf p. 3.

14 See e.g., Christopher Hood, ‘Transparency in Historical Perspective’ in Christopher Hood and David Heald (eds), Transparency: The Key to Better Governance? (Oxford University Press 2006) 3.

15 Albert Meijer, ‘Transparency’ in Mark Bovens, Robert E Goodin and Thomas Schillemans (eds), The Oxford Handbook of Public Accountability (Oxford University Press 2014) 507, 511 (emphasis omitted), cited from Bennett Moses, Lyria; Louis de Koker, ‘Open Secrets: Balancing Operational Secrecy and Transparency in the Collection and Use of Data by National Security and Law Enforcement Agencies’ (2017) 41(2) Melbourne University Law Review 530, 535.

16 Bennett Moses and de Koker, supra note 15, 535; see also Ann Florini, ‘Introduction: The Battle over Transparency’ in Ann Florini (ed), The Right to Know: Transparency for an Open World (Columbia University Press 2007) 1, 2. See also Alasdair Roberts, ‘Transparency in the Security Sector’ in Ann Florini (ed), The Right to Know: Transparency for an Open World (Columbia University Press 2007) 309, 3213.

17 Bennett Moses and de Koker, supra note 15, 535.

18 Bennett Moses and de Koker, supra note 15, 535–36.

19 Bloch-Wehba, supra note 12, at 925.

20 Mary D. Fan, ‘Privacy, Public Disclosure, Police Body Cameras: Policy Splits’, 68 Ala. L. Rev. 395, 410–11 (2016).

21 Jones, supra note 6, at 805.

22 Jones, supra note 6, at 805–06.

23 Bennett Moses and de Koker, supra note 15, at 540.

24 Bloch-Wehba, supra note 12, at 925.

25 Bloch-Wehba, supra note 12, at 925.

26 Diogo V. Carvalho, Eduardo M. Pereira and Jaime S. Cardoso, ‘Machine Learning Interpretability: A Survey on Methods and Metrics’ (2019) 8(8) Electronics, at 5–7; Leilani H. Gilpin et al., ‘Explaining Explanations: An Overview of Interpretability of Machine Learning’ Computer Science and AI Laboratory (MIT, 2019).

27 Défenseur des Droits, Algorithms: Preventing Automated Discrimination, 2020 https://www.defenseurdesdroits.fr/sites/default/files/atoms/files/synth-algos-en-num-16.07.20.pdf p. 9.

28 OECD, AI Principles, Principle 7, https://oecd.ai/en/dashboards/ai-principles/P7.

29 See Commission, ‘Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative Acts’ Com (2021) 206 Final, para 38 (draft EU AI Act).

30 Bloch-Wehba, supra note 12, at 925; see also David E. Pozen, ‘Freedom of Information Beyond the Freedom of Information Act’, 165 U. Pa. L. Rev. 1097, 1102 (2017); David E. Pozen, ‘Transparency’s Ideological Drift’, 128 Yale L.J. 100, 151 (2018) (arguing that the Freedom of Information Act aggravates a ‘mounting adversarialism’ between government and public); Mark Fenster, ‘The Opacity of Transparency’, 91 Iowa L. Rev. 885, 932 (2006) (expressing skepticism that open government laws can address ‘populist fears of secrecy, especially those that are deep-seated and lead to an all-encompassing distrust of the political order’).

31 See Pozen, 2018, supra note 30, at 123; Mark Fenster, ‘Seeing the State: Transparency as Metaphor’, 62 Admin. L. Rev. 617, 628 (2010); Julie E. Cohen, ‘The Inverse Relationship Between Secrecy and Privacy’, 77 Soc. Rsch. 883, 890–91 (2010); Kate Levine, ‘Discipline and Policing’, 68 Duke L.J. 839, 854 (2019) (recognizing many scholars ‘have argued that there are serious tradeoffs that come with overreliance on visibility’).

32 Mike Ananny and Kate Crawford, ‘Seeing without Knowing: Limitations of the Transparency Ideal and Its Application to Algorithmic Accountability’ (2016) 20(3) New Media and Society, 1.

33 Onora O’Neill, A Question of Trust: The BBC Reith Lectures 2002 (Cambridge University Press 2002).

34 Christopher C Hood, ‘Transparency’ in Paul Barry Clarke and Joe Foweraker (eds), Encyclopedia of Democratic Thought (Routledge, 2001), 7034.

35 Bloch-Wehba, supra note 12, at 929.

36 ibid.

37 WEF et al, supra note 8, Principle 3.

38 Council of Europe, Convention for the protection of individuals with regard to the processing of personal data, June 2018 (Convention 108+).

39 Council of Europe, supra note 9, at 20.

40 Council of Europe, supra note 9, at 23.

41 European Data Protection Board, supra note 10, at 5.

42 Human Technology Institute, Facial Recognition Technology: Towards a Model Law, 2022, https://www.uts.edu.au/sites/default/files/2022-09/Facial%20recognition%20model%20law%20report.pdf, at 62.

43 WEF/Interpol, supra note 8, Principle 3.

44 WEF/Interpol, supra note 8, Principle 3, Council of Europe, supra note 9, at 20; European Data Protection Board, supra note 10, at 5; Human Technology Institute, supra note 42, at 62.

45 WEF/Interpol, supra note 8, Principle 3.

46 WEF/Interpol, supra note 8, Principle 3; Council of Europe, supra note 9, at 20.

47 Council of Europe, supra note 9, at 23.

48 E.g. Meijer, supra note 15, at 57; Bennett Moses and de Koker, supra note 15, at 535.

49 O’Neill, supra note 33.

50 ibid.

51 See, e.g., EDPB, supra note 10, Human Technology Institute, supra note 42.

52 E.g. M Mann and M Smith, ‘Automated Facial Recognition Technology: Recent Developments and Approaches to Oversight’ (2017) 40(1) The University of New South Wales Law Journal 121–145; Neroni Rezende, I. (2022), ‘Facial Recognition for Preventive Purposes: The Human Rights Implications of Detecting Emotions in Public Spaces’ in Winter L. Bachmaier and S Ruggeri (eds) Investigating and Preventing Crime in the Digital Era. Legal Studies in International, European and Comparative Criminal Law (vol 7. Springer, Cham). https://doi.org/10.1007/978-3-031-13952-9_4.

53 O’Neill, supra note 33.

55 The following organizations are especially active in the FRT space and would be interested in in-depth information about the technologies: European Digital Rights (EDRi), Access Now, American Civil Liberties Union (ACLU), Choice (Australia).

56 E.g. Ali Akbari, ‘FRT 101: Technical insights’ in Rita Matulionyte and Monika Zalnieriute (eds), Facial Recognition in the Modern State (forthcoming, Cambridge University Press).

57 ibid.

58 Draft EU AI Act, supra note 29, articles 5(1)(d) and 6, Annex 3.

59 Nessa Lynch and Andrew Chen, Facial Recognition Technologies: Considerations for Use in Policing, 2021 https://www.police.govt.nz/sites/default/files/publications/facial-recognition-technology-considerations-for-use-policing.pdf, at 72–74; see also Nessa Lynch, Liz Campbell, Joe Purshouse, Marcin Betkier, Facial Recognition Technology in New Zealand, Towards a Legal and Ethical Framework, The Law Foundation, 2020 https://www.wgtn.ac.nz/__data/assets/pdf_file/0010/1913248/Facial-Recognition-Technology-in-NZ.pdf.

60 Lynch and Chen, supra note 59, at 72.

61 Such risks could include unnecessary and disproportionate interference with privacy (e.g. where authentication could be performed by other similarly effective means and there is no opt-out option), insufficient measures to ensure the security of data, and unclear or inappropriate rules on when collected faceprints may be stored, for how long, and how they may be shared.

62 E.g. Access Now, Reclaim Your Face from Surveillance: Ban Facial Recognition, 23 February 2021 https://www.accessnow.org/reclaim-your-face-ban-biometric-mass-surveillance/.

63 Draft EU AI Act, supra note 29, article 5(1)(d).

64 See Information categories identified above.

65 Council of Europe, supra note 9, at 20.

66 See e.g. footnotes 8–10 above.

67 E.g., Police release findings from independent expert review of Facial Recognition Technology (New Zealand), 9 December 2021, National News, https://www.police.govt.nz/news/release/police-release-findings-independent-expert-review-facial-recognition-technology.

69 See e.g. Upol Ehsan and others, ‘Expanding Explainability: Towards Social Transparency in AI systems’ (Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, May 2021); Alejandro Barredo Arrieta and others, ‘Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI’ (2020) 58 Information Fusion 82.

70 Matulionyte et al., ‘Should AI-enabled Medical Devices be Explainable?’ (2022) 30(2) International Journal for Law and Information Technology 151.

71 See Arrieta et al., supra note 69.

72 See also Bennett Moses and de Koker, supra note 15.

73 See e.g., LivePerson, Inc. v. 24/7 Customer, Inc., 83 F. Supp. 3d 501, 514 (S.D.N.Y. 2015) (finding algorithms based on artificial intelligence eligible for trade secret protection).

74 E.g. US Freedom of Information Act Guide, May 2004, Exemption 4.

75 R Matulionyte and T Abramovich, ‘AI Explainability and Trade Secrets’ in R Abbot (ed) Research Handbook on Artificial Intelligence and Intellectual Property (Edward Elgar 2022) 404–21.

76 See Sharon K. Sandeen and Tanya Aplin, ‘Trade Secrecy, Factual Secrecy and the Hype Surrounding AI’ in Ryan Abbott (ed) Research Handbook on Intellectual Property and Artificial Intelligence (Edward Elgar 2022); see also Camilla A. Hrdy and Mark A. Lemley, ‘Abandoning trade secrets’ (2021) 73(1) Stanford L Rev 1.

77 E.g. Brennan Ctr. for Justice v. New York City Police Dept., No. 160541/2016, 2017 WL 6610414, at *9 (N.Y. Sup. Ct. Dec. 27, 2017).

78 See Jake Goldenfein, 'Algorithmic Transparency and Decision-Making Accountability: Thoughts for buying machine learning algorithms' in Office of the Victorian Information Commissioner (ed), Closer to the Machine: Technical, Social, and Legal aspects of AI (2019), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3445873

79 Bennett Moses and de Koker, supra note 15, 538–39.

80 Bennett Moses and de Koker, supra note 15, at 543; See Tal Z Zarsky, ‘Transparent Predictions’ [2013] University of Illinois Law Review 1503, 1553–8.

81 Cited in Bennett Moses and de Koker, supra note 15, at 543.

82 See Metropolitan Police website on facial recognition: https://www.met.police.uk/advice/advice-and-information/fr/facial-recognition.

83 Bennett Moses and de Koker, supra note 15, at 542–43.

84 WEF/Interpol, supra note 8, Principle 3.

85 Council of Europe, supra note 9, at 20.

86 Bennett Moses and de Koker, supra note 15, 555–56.

87 WEF/Interpol, supra note 8, principle 3; Article 11 of Convention 108+.

88 Article 11 of Council of Europe Convention 108+, supra note 38.

89 ibid.

Additional information

Funding

This paper is part of a project funded by the Lithuanian Research Council, agreement number S-MIP-21-38.
