Research Article

From object obfuscation to contextually-dependent identification: enhancing automated privacy protection in street-level image platforms (SLIPs)


ABSTRACT

Street-level image platforms (SLIPs) employ indiscriminate forms of data collection that include potentially privacy invasive images. Both the scale and the indiscriminate nature of data collection mean that significant privacy management is required. Legal risk is currently managed through obfuscation techniques applied to certain image objects. Current SLIP object obfuscation solutions are an indiscriminate and blunt response to a similarly indiscriminate data collection concern. A new contextual approach to obfuscation is required that goes beyond object obfuscation. Contextually-dependent identification would seek to identify the contexts, including captured objects, which can give rise to privacy concerns. It is technically more challenging for automated solutions as it requires an assessment of the contextual situation to understand privacy risk. Context-sensitive privacy detection, combined with context-sensitive privacy-by-design processes, potentially offers a risk management solution that better situates and addresses the concerns arising from SLIP data collections.

I. Introduction

Obfuscation, in its humble, provisional, better-than-nothing, socially contingent way, is deeply entangled with the context of use.Footnote1

Brunton and Nissenbaum’s quote highlights the challenges that arise for street-level image platforms (SLIPs) which have emerged over the last two decades.Footnote2 SLIPs combine online mapping technologies with 360-degree, panoramic street level views that feature inbuilt track, pan and zoom capabilities.Footnote3 While data collection techniques vary, imagery for these products is typically gathered using a fleet of vehicles equipped with specialised cameras that capture images to be ‘sewn’ together into a navigable interface.Footnote4 Image data collection is thus largely indiscriminate because SLIP cameras capture street level photography, at a given point in time, including persons and other common objects found in the everyday life of global human societies.

Such indiscriminate collections of street-level imagery give rise to significant privacy concerns that require SLIPs to implement privacy risk management strategies across a vast trove of image data. Legal risks are somewhat mitigated by the ‘public’ nature of image data, which privacy law has traditionally accorded a lower degree of protection.Footnote5 However, it is still possible for SLIPs to give rise to legal issues under privacy torts and data protection law frameworks. Not surprisingly, then, advances in street-level image capture have attracted privacy concerns across many jurisdictions, particularly in relation to the depiction of identifiable individuals, their facial features,Footnote6 vehicle registration platesFootnote7 and residential homes.Footnote8 These complaints have given rise to a new form of automated legal risk management solution: obfuscation of partial or complete object imagery, which provides privacy protection by blurring features to make them unrecognisable online.

Obfuscation of object imagery has become the primary privacy risk management tool for SLIPs. It thus provides the type of ‘better-than-nothing, socially contingent way’ of resolving the complex privacy issues that arise from the indiscriminate collection of global street imagery. However, as Brunton and Nissenbaum highlight, obfuscation is still ‘deeply entangled in the context of use.’Footnote9 Context, in relation to SLIP data collections and obfuscation, has two connotations. First, acts of obfuscation arise within a ‘context of unavoidable relationships between people and institutions with large informational and power asymmetries.’Footnote10 In other words, context can render obfuscated individuals identifiable. The second connotation is that privacy is a ‘multi-faceted concept’ with a ‘wide range of structures, mechanisms, rules and practices … to produce and defend it.’Footnote11 The types of privacy rights and expectations enlivened by a particular technology will be shaped by the context of use. Obfuscation is therefore but one tool in a considerably complex toolbox that is designed to provide a legal risk management solution to meet many different privacy rules and contexts.

Our paper argues that SLIPs are currently not using all the tools available in the privacy toolbox due to a restricted understanding of privacy concerns and the favouring of one convenient tool, object obfuscation, at the expense of the more complex, but comprehensive consideration of contextually dependent identification. Part II identifies and categorises the key failings of object obfuscation that arise from the capture, aggregation, and disclosure of digital mapping imagery. Part III provides an overview of how privacy issues are generally treated under two key areas of law: privacy torts and data protection legislation. Part IV then develops contextually-dependent technical and legal solutions to shift SLIP obfuscation strategies from a ‘humble, provisional, better-than-nothing’ solution to one that is appropriately ‘deeply entangled with the context of use.’

II. Privacy concerns and obfuscation problems

Advances in street-level mapping have attracted privacy concerns across many jurisdictions. Google Street View, the earliest and most comprehensive of SLIPs,Footnote12 has received the most attention and criticism. Following its initial launch in the US in 2007, Google was confronted with a raft of complaints from individuals, government agencies, and advocacy groups objecting to the depiction of residential homes, vehicle registration plates and identifiable individuals.Footnote13 Such concerns prompted various inquiries by government agencies and regulators, sometimes resulting in temporary bans.Footnote14 Resistance on the basis of privacy concerns was particularly staunch in Germany, where a swell of complaints and regulatory actions led to Google’s complete cessation of Street View recording activities;Footnote15 a situation that persisted until recently.Footnote16

Legal grounds for objection and regulatory responses to Street View varied across jurisdictions. In the US, a spate of civil actions based on the application of different privacy torts were pursued unsuccessfully.Footnote17 In other parts of the world, regulators brought actions based on infringements of data protection law.Footnote18 Notwithstanding the activation of diverse types of privacy law, legal actions sprang from a common set of privacy concerns regarding the scale and novelty of Google Street View. Street View’s global data collection programme was enormous and initially conducted without any notice to, or permission from, either individuals or governments.Footnote19 While Street View’s collection of public street imagery was novel, due to its scale and ambition, the collection process itself involved variations of old practices and technologies which, in isolation, implicate privacy interests long contemplated by privacy laws, such as street photographyFootnote20 and public surveillance.Footnote21 However, the unprecedented scale of Street View,Footnote22 the sheer volume of imagery captured, and the scope of its availability rendered the perceived privacy impact of the project greater than that of its constituent processes, and greater than that contemplated by the relevant privacy laws developed to govern those processes.Footnote23

The widespread criticism prompted Google to implement certain technical and organisational privacy protections into the product.Footnote24 Google’s main response to early objections involved image pixelation of collected Street View imagery to obfuscate images of certain objects, including faces and vehicle licence plates, that could give rise to recognised privacy law risks, particularly those involving data protection laws.Footnote25 Google’s initial deployment was also selective rather than comprehensive across the whole Street View project as it focussed on object obfuscation strategies in select jurisdictions to conform with local privacy laws.Footnote26 Google later acquiesced to further pressure from legislators, privacy regulators and individual advocates to adopt an automated, pre-emptive approach to the obfuscation of faces and vehicle licence plates across the globe.Footnote27

According to Google’s own description of its obfuscation measures in 2009, the new system was a ‘completely automatic system’ that could ‘sufficiently blur more than 89% of faces and 94–96% of license plates in evaluation sets sampled from Google Street View imagery.’Footnote28 The obfuscation approach involved a combination of noise and aggressive Gaussian blur which blended with surrounding background features to obfuscate targeted objects in images.Footnote29 The automated process of pre-emptive object obfuscation was not, however, failsafe. Google conceded that users were required to ‘narrow the gap between automatic performance and 100% recall’ by requiring complainants to self-report obfuscation errors in Street View imagery via a contact link provided in the live product.Footnote30 Jane Horvath, Google’s then Senior Privacy Counsel, summed up the shortcomings of the approach, stating that ‘our blurring technology is not perfect – we occasionally miss a face or license plate’ and ‘for the few that we miss, the tools within the product make it easy for users to report a face or license plate for extra blurring.’Footnote31 Instead of pre-emptive blurring, residential properties were blurred only after receipt of an individual privacy complaint or for national security purposes.Footnote32 Property obfuscation thus takes place post publication, after the threat to privacy has materialised.
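To make the mechanics of this approach concrete, the following is a minimal illustrative sketch of blur-plus-noise obfuscation applied to a detected image region. It is not Google’s implementation; the region coordinates, kernel size and noise level are assumptions chosen for illustration.

```python
# Illustrative sketch only: simplified blur-plus-noise obfuscation of a detected
# region, in the spirit of the approach described above. Parameters are assumptions.
import cv2
import numpy as np

def obfuscate_region(image, box, kernel=(51, 51), noise_sigma=12.0):
    """Apply aggressive Gaussian blur plus additive noise to a bounding box.

    image: HxWx3 uint8 array; box: (x1, y1, x2, y2) pixel coordinates.
    """
    x1, y1, x2, y2 = box
    region = image[y1:y2, x1:x2].astype(np.float32)

    # Aggressive Gaussian blur removes fine detail (faces, plate characters).
    blurred = cv2.GaussianBlur(region, kernel, 0)

    # Additive noise helps the patch blend with surrounding texture and makes
    # naive deblurring less effective.
    noise = np.random.normal(0, noise_sigma, blurred.shape)
    obfuscated = np.clip(blurred + noise, 0, 255).astype(np.uint8)

    out = image.copy()
    out[y1:y2, x1:x2] = obfuscated
    return out

# Example usage with a hypothetical face box returned by an upstream detector:
# img = cv2.imread("street_view_frame.jpg")
# protected = obfuscate_region(img, box=(420, 180, 520, 300))
```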

This standard approach suffers from several weaknesses. We identify four different modes of automated obfuscation failure that have given rise to different types and levels of privacy harm (Table 1).

Table 1. Overview of object obfuscation by mode of failure.

The first type of obfuscation failure, false negatives, occurs where the relevant object, e.g. a person, property or number plate, has not been sufficiently obfuscated to render the object unidentifiable, or has been missed in the obfuscation process. Such failures are most often the result of the identification algorithm not recognising a face or licence plate, such as when a face is captured at a sharp angle or from a distance.Footnote33 In these instances, the face or the number plate is not recognised as an image that requires obfuscation. The absence of obfuscation in an environment where every other object of the same type is blurred could be sufficient to generate a privacy claim based on reasonable expectations of privacy. The reasonable expectation is generated by the SLIP’s object obfuscation itself and arises from its own failure. A comparable issue can arise with partial obfuscation, particularly of facial image objects, where the face is somewhat obfuscated and the obfuscation can be removed to reveal the individual’s face.Footnote34

The second type of failure, false positives, may occur where an object that was not required to be blurred has been obfuscated. Examples include blurring of faces on billboards,Footnote35 statues,Footnote36 as well as animal facial features where the identification algorithm identifies a non-human face as a human one.Footnote37 Though these instances may degrade the quality of the product, and reflect a failure of the overall process of automated object obfuscation, they are unlikely to raise privacy concerns or compliance issues as they do not involve the revelation of individual identity in a data protection context or a private activity in the tortious context.

The third failure involves obfuscation that has been completed successfully, but where, paradoxically, the act of obfuscation draws attention to a person, property or other image object which is intended to be concealed by blurring. In this situation, obfuscation can add a ‘sense of suspicion to otherwise profoundly banal imagery.’Footnote38 The third type of failure is typically problematic with the addition of extra contextual information. For example, this form of obfuscation failure can be seen as a variation of the ‘Streisand Effect’,Footnote39 where an attempt to conceal a thing unintentionally confirms or heightens awareness of it. The ‘Streisand Effect’ itself derives from a legal action brought by the celebrity performer, Barbra Streisand, against several parties over the online publication of images of her residence. Prior to news of the legal action, only six attempts were made to view the property. In the first month following news of Streisand’s action, over 420,000 attempts were made to view the property online.Footnote40

A similar situation arose recently involving the obfuscation of US Supreme Court Justice Brett Kavanaugh’s home, following the leaked publication of the controversial Dobbs judgment.Footnote41 Google’s obfuscation of Kavanaugh’s home on Street View had the effect of drawing attention to the property as the only blurred façade on an otherwise non-blurred street.Footnote42 In this case, object obfuscation served to confirm that the house belonged to Kavanaugh.Footnote43 Both situations show that the obfuscation noise added has the perverse effect of drawing attention to the object that has been obfuscated, thus encouraging investigation into it. Obfuscation in this context is a signal to the curious rather than a noise that obscures.

The final type of failure encapsulates the central challenge that context poses to the efficacy of automated obfuscation processes. The blurring of faces and licence plates is not a definitive means of obfuscating the situational contexts in which privacy infringements can arise. Moreover, the nature and scale of Street View image collection means that identification of individuals based on environmental or other types of contextual elements, notwithstanding the blurring of faces and number plates, is a likely and common occurrence. As Teresa Scassa observes, ‘it is entirely possible to recognise individuals from attributes other than their faces — and their geographical location may combine with these attributes to reinforce the identification’.Footnote44 Accordingly, even though an individual’s face may be blurred, the combination of information about their location, vehicle model and home exterior may lead to identification from broader contextual and environmental factors.

The Canadian case of Pia Grillo v GoogleFootnote45 is an example of how context or ancillary objects may be sufficient to render an individual identifiable, even in situations of successful object obfuscation. In that case, the plaintiff brought a claim for invasion of privacy against Google for capturing and publishing an image of her in front of her Quebec home. Importantly for the decision, she was wearing a sleeveless tank-top with her breasts partially exposed.Footnote46 Though the plaintiff’s face had been blurred, the licence plate of her vehicle and the postal address of her home had not.Footnote47 The court held that the failure to blur information such as her licence plate and residential address may have led to personal identification.Footnote48

Context also shapes the scope and nature of the privacy interests held by the individual. As discussed below, courts have traditionally viewed activities carried out in certain places, such as the home, as attracting greater expectations of privacy. In the Grillo case, questions arose as to whether the plaintiff had tacitly waived her right to privacy because she was seated outside her home and thus visible from the street. The court rejected the argument, but it still highlights the pivotal role of context in shaping legal conceptions of privacy invasion.Footnote49

In sum, context is core to the establishment of privacy interests emanating from SLIP data collections in two ways. First, context may render an otherwise obfuscated individual identifiable. Second, the privacy rights and expectations of an individual will generally turn on the context of an activity. These two functions of context are reflected in data protection legislation and the law of privacy torts, respectively. A key limitation of automated obfuscation systems is the failure to adequately account for contextual nuances. However, these limitations are not merely technical. Delineating the reasonable expectations of privacy in a given context is a social and political process,Footnote50 which is the subject of ongoing contestation in courts, parliaments and public discourses.Footnote51 In the next section, we explore some of the complexities of legally conceptualising the context-sensitive nature of privacy interests implicated by SLIPs, which in turn have implications for designing technical measures for privacy protection.

III. Relevant legal frameworks

The act of capturing, aggregating, and disseminating street-level mapping imagery can implicate a range of privacy interests and rights protected by law. Capturing imagery by entering property without the permission of the owner may amount to trespass. The covert deployment of optical or data surveillance devices may breach surveillance device laws in some jurisdictions.Footnote52 This section focuses on two main areas of law which have provided the basis for actions for invasion of privacy arising out of street-level mapping incidents: tortious invasions of privacy and data protection statutes. Despite a few instances of successful litigation and government pressure, a survey of the existing legal frameworks reveals a gap between public perceptions of privacy invasion and the interests protected by the law.

A. Privacy torts

Street-level mapping projects may give rise to claims for tortious invasion of privacy.Footnote53 Privacy torts are recognised in several common law jurisdictions.Footnote54 Two causes of action, intrusion upon seclusion and publicity of private life,Footnote55 are of relevance to SLIPs.Footnote56 The scope and elements of these torts have developed differently across jurisdictions. Nonetheless, it is possible to glean some common principles and themes which have emerged from attempts by courts in various jurisdictions to define what merits protection from intrusion or publication.

Intrusion upon seclusion involves intentional and unwanted intrusions upon the solitude or seclusion of a person.Footnote57 An intrusion may occur by way of physical entry, sensory or electronic observation, or search or inspection.Footnote58 In order to be actionable, the intrusion must be upon the intimate or private affairs of the person and be highly offensive to the hypothetical reasonable person.Footnote59 The second category, publicity given to private life, is concerned with the public disclosure of private information about a person.Footnote60 In some jurisdictions, it is a requirement that the intrusion or publication must be ‘highly offensive to the reasonable person’.Footnote61 Despite overlap,Footnote62 these categories are generally recognised by courts as separate torts.Footnote63 Broadly speaking, the former tends to concern physical access to, or observation of, a person, activity, or space. The latter involves the publication of private information about a person.

Whether the process and outputs of digital mapping are actionable will depend upon various contextual factors. Location often factors heavily into court assessments of what is ‘private’ and merits protection from intrusion or publication. Traditionally, expectations of privacy are highest in the home,Footnote64 and significantly diminished once individuals venture into public places.Footnote65 However, courts in various jurisdictions have shown a willingness to treat, as private, certain states or activities carried out in public places. US courts have taken the view that photographs taken of an individual as part of a public scene in their ‘ordinary status’ or involved in incidents ‘seen almost daily in ordinary life’ will not violate their privacy.Footnote66 Capturing a person in what courts have considered an ‘embarrassing’ state, such as where private body parts are inadvertently exposed, might violate their privacy,Footnote67 though not where the publication is ‘newsworthy’.Footnote68 The Grillo case is one example of this, though the plaintiff’s location on private property may also have played a role in the finding of privacy interests. English courts have recognised images of a well-known person seeking treatment for drug addiction,Footnote69 and of the children of well-known parents in a public place, as private.Footnote70 The publication of details of an extra-marital relationship to a public audience,Footnote71 and images and details of sexual encounters,Footnote72 have also attracted privacy protection.

While these fact-sensitive decisions cannot be neatly reduced to general categories, it is safe to conclude that the location and state in which an individual is captured, and the nature of their activities, will factor heavily into whether mapping imagery capturing individuals outside of their homes is considered private.Footnote73 Generally, the incidental capture and publication of images of people going about quotidian activities in public places is unlikely to be protected. There are, however, few bright lines which distinguish an ordinary status or activity on one hand, from a private situation on the other.

The ‘highly offensive’ threshold adopted in some jurisdictions is a key obstacle to establishing that the capture or publication of mapping imagery constitutes a tortious invasion of privacy.Footnote74 To be highly offensive, courts may ask whether the act would cause a person of ‘ordinary sensibilities’ distress, humiliation, or anguish.Footnote75 In Boring v Google, a US court was not convinced that the act of entering a driveway accessible from a road marked ‘private road, no trespassing’ and photographing the plaintiff’s residence and swimming pool was highly offensive to a reasonable person.Footnote76 Accordingly, the plaintiffs’ claims of privacy invasion failed.

The way this information is collected has sometimes been material. For instance, the English courts have factored the surreptitious nature of information collection into decisions about invasion of privacy.Footnote77 As noted, the physical dimension of an invasion is recognised in some jurisdictions as a separate cause of action for intrusion into seclusion.Footnote78 Expectations regarding acceptable modes of collection are also context-dependent and evolving, as communities habituate to different practices over time. Take, for instance, the recent re-entry of Google Street View vehicles into Germany.

Overall, tort law has played a limited role in resolving the tensions triggered by street-level mapping technologies. With some exceptions, tort law’s adherence to the notion of ‘public-presence-as-consent’Footnote79 provides limited guidance for locating privacy interests beyond a restricted set of sites and activities. The limits of US privacy torts in addressing the privacy concerns which attended the initial rollout of Google Street View led several academics to argue for reform.Footnote80 However, common law courts have been given few opportunities to resolve perceived inadequacies in the law and thus no major legal developments have taken place.Footnote81 Instead, the easing of friction between community expectations and street-level mapping practices in the decade or so since Google Street View’s debut has largely been a product of compromise reached through changes to Google’s policies and the blurring and takedown mechanisms discussed above. Both were enacted in response to public outcry and pressure from regulators, levelled under the auspices of data protection law.Footnote82 As other authors point out, what distinguishes SLIPs from earlier systems of recording street imagery is the ubiquity of collection, the breadth of distribution,Footnote83 and the potential lifespan of the image – considerations more explicitly dealt with under statutory regimes for data collection, use and disclosure.

B. Data protection legislation

Tortious protections of privacy are predicated on legal mechanisms that establish spaces of non-intrusion. Such protections recognise a conception of privacy concerned with the ability of an individual to limit access to the private elements of their lives. The more one can limit access, the greater protection one has against potentially intrusive behaviour from others. The cases outlined above are emblematic of access-based conceptions of privacy in their focus on shielding a private context, whether it be a specific private space or property,Footnote84 or a private aspect of individual life, whether conducted in a private or a public space.Footnote85 The cases also recognise the prospect of amplification to a broader audience as a form of access intrusion, especially in situations that draw attention to physical features that would normally be classed as private.Footnote86

All these issues clearly give rise to legal risk management considerations for SLIPs which are built into technical solutions based on obfuscation of risk-generating captured imagery. The scale of global mapping platforms is such that managing the tortious risks that arise from intrusion type infringements is complex. However, mapping platforms need to manage this complexity in tandem with a different set of privacy requirements, namely, data protection, albeit from a perspective that is intended to apply in a limited, ‘common sense’ way.Footnote87 In doing so, the privacy focus shifts from the preservation of access to limit private intrusions to establishing and maintaining individual control of personal data, particularly imagery. A different form of privacy legal analysis is now at play that focuses more significantly on whether collected image data is classifiable as personal data. If that is the case, then data collectors, including the harnessers of public geographical images, may be subject to stringent data protection requirements.

The intention of data protection law is to establish processes of control for individuals regarding the handling of their personal information.Footnote88 A range of legal obligations are placed on data collectors, such as SLIPs, that begin at the point of data collection and end with destruction or de-identification of data that is no longer required.Footnote89 The guiding control mechanisms are called ‘privacy principles’Footnote90 or, in the US context, ‘fair information principles’,Footnote91 which govern data handling obligations for data collectors and provide a range of interaction points for individuals. In the interim, data collection organisations have a range of obligations to fulfil. The individual must be notified about the purposes of collection so they can meaningfully consent to subsequent uses.Footnote92 Personal data can generally only be used for a defined purpose about which the individual is adequately informed.Footnote93 Individuals have a range of interaction mechanisms that seek to ensure the maintenance of control by being able to affirm the accuracy and currency of collected personal information.Footnote94 Personal data, once collected and stored, must be kept secure.Footnote95

Data protection law thus seeks to provide individuals with varying degrees of control and involvement in personal information exchange processes. However, the application of data protection law, whilst predicated on underlying notions of individual control,Footnote96 is also cognisant of the requirements of data collecting organisations and the flow-on benefits of personal information use for society.Footnote97 Balance between individual protections and organisational requirements is a key component of data protection law, including for SLIP collections. It is this balancing requirement, between individual privacy infringement and broader societal benefit, that lies at the heart of both the ‘common sense’ approaches some jurisdictions have taken to SLIP data protection issues and the more stringent regulatory perspectives adopted by others.Footnote98 Some member countries of the EU have been the strictest.

The European Union model of data protection provides greater rights-based protections for individuals because data protection is a fundamental right of EU citizenship.Footnote99 The US model of information privacy places greater emphasis on market-based activities and therefore provides a lesser degree of protection for individuals.Footnote100 OECD-based systems place greater emphasis on balancing interests to facilitate data exchange processes, and any notion of rights-based protection exists in statutory rather than fundamental forms. These geo-political issues are of obvious importance to SLIPs because different jurisdictions will have different data protection expectations. As such, whilst core conceptual constructs are similar across the three systems, they differ in application, which becomes problematic for SLIPs that operate across all three frameworks.

Nevertheless, the first question to consider is the same, namely, whether the image of an individual is classifiable as a type of information that can trigger data protection requirements. In this situation the basic question to ask is whether the image of an individual captured as part of SLIP image capture would be personal data or personal information, depending on the jurisdictional context.Footnote101 It is useful to note, at this point, the different legal approaches compared to privacy torts. A key consideration in relation to privacy torts concerned whether a realm of privateness had been captured and whether an intrusion into the private was likely to infringe a reasonable expectation of privacy. Both are foundational issues in tort that are much less relevant in data protection. The public/private distinction is not a requirement of data protection obligations and a reasonable expectation of privacy is not the foundational legal test. Consequently, images captured in public spaces under a data protection perspective can give rise to legal obligations on account of the identifiability of an individual rather than an intrusion into an individual’s private life, as noted above.Footnote102 The question of whether SLIP image collection involves personal data or information, including sensitive information, is thus a threshold test.Footnote103

The classificatory basis of regulated information is therefore important because it reflects many of the political considerations inherent to the application of data protection law. The US situation, in comparison to the EU and Australia, is an important case in point. Each jurisdiction has a different method of classification which has an impact on the scope of application, including for SLIPs. For example, both EU and OECD jurisdictions have an intentionally flexible definition of personal data or personal information that covers information relating toFootnote104 or aboutFootnote105 an identified or reasonably identifiable individual. The use of ‘relating to’ and ‘about’ defines the connective process that links an individual to a recognisable and applicable process of identity.Footnote106 Two states of identity then flow, namely, identified and identifiable (or reasonably identifiable, in the Australian context). There is general agreement about how the two states of identity are conceptualised in both jurisdictions, which entail context-independent and context-dependent approaches.Footnote107

A context-independent approach enables the categorisation of personal information without recourse to the social context within which the information is used. It represents the ‘identified’ state of both definitions. The removal of social context simplifies the categorisation of personal information because it is possible to make a definitive prediction of what information is always likely to be classified as personal information. For example, a name is always likely to reveal identity and will therefore always be personal data or information. Similarly, a photograph of an individual who is clearly recognisable from the image will be personal data or information. If that is the case, then, in some jurisdictions, collections may require the individual to consent to image collection before it takes place.Footnote108 This of course is an extremely difficult task for SLIP image collections, which are indiscriminate by nature. The issue of consent acquisition is obviated by object obfuscation of faces, which makes the individual unidentifiable, and by post-publication reporting options when automated errors occur.

However, that is not the end of the story. A context-dependent approach, as defined by the ‘identifiable’ or ‘reasonably identifiable’ state, deems that personal information can only be identified by examining the social context within which a piece of information is used.Footnote109 This makes definitive prediction virtually impossible because all information could be classed as personal information in certain circumstances, which are likely to be inherently subjective. For example, a home address does not automatically reveal identity, but it can do so in certain circumstances through cross-referencing with other information.Footnote110 The same can also be said for SLIP image collections, as exemplified by the Pia Grillo decision highlighted above. Even though the plaintiff was not facially identifiable, the court decided that her identity could be ascertained in relation to her location and other bodily factors. Complete object obfuscation of faces, unlike the examples of false negatives highlighted in Section II, might be an automated solution for identified types of imagery that could be personal data or information, but it is not a guaranteed solution to the more complex context-dependent types of identifiability.

Most jurisdictions also recognise that different forms of personal information can have greater levels of sensitivity attached to them, such as data about racial origins, political or philosophical beliefs, sexual orientation or biometric data.Footnote111 The issue of whether SLIP facial image data collection is biometric data and therefore sensitive is not of direct relevance to our paper. However, recent regulatory actions against Clearview AI appear to show that the types of collection pertinent to SLIPs are increasingly being assessed in relation to their scale and application. The action brought by the Office of the Australian Information Commissioner, in conjunction with the UK’s Information Commissioner’s Office, is a case in point.Footnote112 Clearview AI’s business model was to provide a facial recognition tool, principally for law enforcement agencies throughout the world.Footnote113 To do so, it required a massive database of facial imagery, which Clearview AI assembled through web scraping activities.Footnote114 Much like the initial advent of Google Street View, Clearview AI’s business gave rise to several regulatory investigations based on the application of data protection law.

A key procedural question in the Australian investigation involved whether Clearview AI was carrying out a business in Australia under s5B(3)(b) of the Privacy Act 1988 (Cth). The Commissioner found that Clearview AI was carrying out business in Australia when it employed its web crawling technology that scraped Australian websites.Footnote115 More importantly for this paper, the Commissioner also made clear that Clearview’s scraping was of an ‘indiscriminate nature’, and that the scale of its database, which contained ‘at least 3 billion images’, meant that it must have collected Australian facial imagery for services intended for the domestic market.Footnote116 Moreover, because Clearview used the imagery for biometric profiling, it could not rely on forms of implied consent where explicit consent was required, especially in relation to collections about which individuals were not adequately informed.Footnote117 The indiscriminate nature of collection, combined with the sensitivity of use, meant that Clearview’s opt-out mechanism was not sufficient, especially when balanced against the ‘serious consequences for the individual’.Footnote118

The Clearview decision, while not directly relevant to our SLIP privacy considerations, is still helpful because it demonstrates the type of balancing exercise that regulators and courts engage in when considering massive and indiscriminate types of image data collection. Individuals are unaware of data collections in both types of situation and cannot provide meaningful or valid consent. As a result, individuals do not have the types of control over personal data that data protection laws seek to instil. Moreover, even though Clearview involved sensitive information for biometric use, which is beyond the scope of our paper, it is nonetheless important because it highlights that the combination of indiscriminate collections to produce massive facial image databases will give rise to specific types of adjudicatory reasoning. Scale is becoming important, and it is therefore an issue to which SLIPs need to be more attuned, given their size and indiscriminate forms of collection. With that in mind, we now set out a different way of thinking about automated privacy protection for SLIP data collections. Our approach moves away from exclusive reliance on object obfuscation as the only regulatory solution and towards a more holistic understanding of privacy concepts, in tandem with more sophisticated forms of automated response involving contextually-dependent identification.

IV. Enhancing automated privacy protection

The above analysis reveals the complex privacy considerations at play regarding SLIP data collections and usages. The indiscriminate nature of SLIP image collections, and the equally indiscriminate process of object obfuscation as a general privacy protection, gives rise to questions about whether SLIPs are doing enough to sufficiently protect privacy. To think about this sufficiency question, we go back to Brunton and Nissenbaum’s two contextual connotations, namely, that acts of obfuscation must be considered within the context of power asymmetries and that privacy is a ‘multi-faceted concept’ with a range of different ways in which it can be protected. We finish our substantive considerations by examining technical alternatives to ‘better-than-nothing’ protective processes of SLIP object obfuscation. In doing so, we ask whether it is possible to adopt processes of automated contextually-dependent identification that are more attuned to the complex privacy scenarios identified in this paper. Doing so requires machine learning processes to holistically identify individual and environmental contexts arising from the principled analysis of tortious and data protection laws across four areas of potential object obfuscation failure. They are where:

  1. Obfuscation failed at the individual level e.g. where a person’s image was ineffectively pixelated and a person was re-identified despite the obfuscation applied to facial features (a false negative example of facial obfuscation).

  2. Obfuscation failed at the environmental level e.g. where a property or some other thing was ineffectively pixelated and a person was re-identified despite the obfuscation applied to the property or thing (a failure to obfuscate a licence plate or a home name plaque).

  3. Obfuscation did not fail at the individual level, but the un-pixelated environment nevertheless still leads to the identification of the individual (e.g. the type of contextual identification in the Pia Grillo case).

  4. Obfuscation did not fail at the environmental level, but the pixelation nevertheless still draws attention to an individual in combination with other data (e.g. the ‘Streisand Effect’).

Incorporating these four areas of potential obfuscation failure gives rise to new complexities for automated forms of analysis that require both individual and environmental contexts to be assessed. As noted immediately below, while it is technically possible to identify context from images, it remains a non-trivial task, particularly when conducted at the scale at which SLIPs operate. We therefore contend that further research into automated contextually-dependent privacy detection is required to ensure the true complexity of privacy risks is sufficiently contemplated in SLIP processes of ameliorative protection.
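As an illustration of what such contextually-dependent analysis might involve, the following sketch flags images for human review across the four failure modes listed above. It assumes the existence of an upstream detector; the labels, fields and co-occurrence heuristics are hypothetical and deliberately simplified.

```python
# Minimal illustrative sketch of contextually-dependent risk flagging across the
# four failure modes. Detection fields and heuristics are hypothetical; a real
# system would need far richer scene understanding.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g. "face", "licence_plate", "house_number"
    obfuscated: bool    # whether an obfuscation mask was applied

IDENTIFYING_CONTEXT = {"licence_plate", "house_number", "name_plaque", "mailbox_name"}

def contextual_risk_flags(detections):
    """Return human-review flags rather than relying on object blurring alone."""
    flags = []
    faces = [d for d in detections if d.label == "face"]
    context = [d for d in detections if d.label in IDENTIFYING_CONTEXT]

    # Mode 1: individual-level failure (face found but not obfuscated).
    if any(not f.obfuscated for f in faces):
        flags.append("unblurred_face")

    # Mode 2: environmental-level failure (identifying context not obfuscated).
    if any(not c.obfuscated for c in context):
        flags.append("unblurred_identifying_context")

    # Mode 3: face blurred, but surrounding context may still identify the person.
    if faces and all(f.obfuscated for f in faces) and context:
        flags.append("possible_contextual_identification")

    # Mode 4: the obfuscation itself may attract attention, e.g. a single blurred
    # object in an otherwise unblurred scene.
    blurred = [d for d in detections if d.obfuscated]
    if len(blurred) == 1 and len(detections) > 5:
        flags.append("possible_streisand_effect")

    return flags
```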

However, whilst the development of new processes of automated contextually-dependent privacy protection would provide a more sufficient technical process to better handle the ‘multi-faceted’ elements of privacy law, it does not alone assist with Brunton and Nissenbaum’s observation that obfuscation processes must be considered within the context of power asymmetries. We consequently also consider new types of legal risk management process that could be adopted based on existing processes of privacy-by-design that are more cognisant of the power contexts relevant to SLIPs. Key to the power context is the scale of SLIPs and thus their ability to establish the privacy expectations of the broader populace. We therefore conclude the paper by examining whether automated obfuscation is a sufficient protection, considering the technical enhancements possible, in conjunction with the scale, indiscriminate nature and power of SLIP activities.

A. Contextually-dependent automated and semi-automated privacy detection

As noted above, the current approach to privacy detection and intervention tends towards literal interpretations of privacy-sensitive features. Objects that are understood to contain personal data or information, such as faces or number plates, are detected through algorithms, bounded and blurred. However, context-dependent detection and intervention requires more sophisticated scene understanding. In this circumstance, an object may constitute personal information only when accompanied by specific contextual elements. Features which are context dependent, partially obscured or outside of recognised ‘privacy sensitive’ categories are often not captured by existing methods. Computer vision systems optimise for such tasks using methods such as object detection, instance segmentation and semantic segmentation. However, they fail to address scene-dependent privacy issues as they falsely assume a binary categorisation of objects as either private or non-private.Footnote119 As noted above, whilst binary differentiation may be used to enhance technical functionality, the idea that there is a distinct public and private sphere which can be used to delineate legal protections is seen as increasingly problematic.Footnote120 Accordingly, there is a need to develop context-dependent object detection methods that support more nuanced privacy protections based on a deeper contextual understanding of privacy torts and data protection law.

The technical components of a more sophisticated contextual analysis already exist but are complex, particularly in operation at scale. Continuous grading options exist, but they assume a type of monotonic scale that tends to oversimplify context. For example, object detection is an important task for autonomous driving systems because it allows vehicles to actively detect roads, pedestrians, and other vehicles.Footnote121 The existing approach to object detection is underpinned by the development of deep learning and neural networks.Footnote122 It combines the two tasks of object localisation and image classification to determine where an object is within an image and assign it a class label.Footnote123 The complexity of traffic scenarios and real-time decision making also requires instance and semantic segmentation.Footnote124 There will often be more than one vehicle within an image frame, each of which must be detected and precisely located. These actions require the detection of individual object instances, pixel-by-pixel segmentation and assignment of class labels.Footnote125

The three most common methods for object detection are R-CNN (‘Region-Based Convolutional Neural Network’),Footnote126 YOLO (‘You Only Look Once’)Footnote127 and SSD (‘Single Shot Detection’).Footnote128 R-CNN models make use of a two-step process, proposing multiple regions within an image and using a CNN to classify each of the proposed regions.Footnote129 This has been improved by Faster R-CNN, which integrates Region Proposal Networks (RPNs) to effectively combine the two steps while maintaining high accuracy.Footnote130 In contrast, YOLO is an end-to-end method which offers faster performance by predicting bounding boxes and class labels in a single forward pass.Footnote131 Similarly, SSDs balance speed and detection accuracy by avoiding a dedicated region proposal network, instead predicting bounding boxes and class categorisations directly from feature maps in one single pass.Footnote132 The choice of object detection method for an autonomous driving system is consequently a trade-off between speed, detection accuracy and detection granularity. However, while these methods improve object detection, none of them make use of contextual features beyond individual grids of region proposals. They improve contextuality only in the sense of a better, closer-to-real-time understanding of specific objects, which is an obvious limitation for the more complex forms of context-dependent analysis we argue are needed above.
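As a minimal illustration of how one of these detector families can be applied to street-level imagery, the following sketch runs a pretrained Faster R-CNN model from torchvision over a single image. It is a generic example rather than any SLIP’s actual pipeline, and the confidence threshold is an assumption.

```python
# Illustrative sketch: running a pretrained two-stage detector (Faster R-CNN)
# over a single street-level image using torchvision (recent versions).
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = Image.open("street_scene.jpg").convert("RGB")  # hypothetical input file

with torch.no_grad():
    predictions = model([to_tensor(image)])[0]

# Keep only confident detections; each entry has a box, a COCO class label and a score.
keep = predictions["scores"] > 0.7
boxes = predictions["boxes"][keep]    # (x1, y1, x2, y2) in pixel coordinates
labels = predictions["labels"][keep]  # integer COCO category ids (1 == person)
print(f"{len(boxes)} confident detections")
```

Downstream obfuscation (such as the blurring or pixelation sketches in this paper) could then be applied to the boxes whose labels correspond to privacy-sensitive classes.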

The concept of ‘context’ in computer vision includes any information that could influence how a scene and its objects are perceived.Footnote133 This often includes local pixel information which assists in image segmentation and boundary extraction.Footnote134 For example, Dalal and Triggs found that incorporating background information into detection of pedestrians improved accuracy.Footnote135 However, contemporary object detection methods have progressed beyond these techniques to achieve more complete scene understanding. Enlarging the receptive field of object detection networks has been shown to improve overall accuracy.Footnote136 The advent of transformers and attention-based mechanisms has allowed detection techniques which utilise full-image receptive fields.Footnote137 This enables scene configuration to be exploited as an additional source of information for object detection.Footnote138

In particular, an object’s presence, appearance and location within an image can be more accurately detected with scene understanding.Footnote139 The likelihood of an object’s presence can be predicted using the environment, the presence of other objects and the scene layout.Footnote140 Again, this development is particularly significant for autonomous driving systems because it has increased accuracy by identifying objects that are not on the road.Footnote141 The extraction of context features was categorised by Divvala and others into three distinct categories: semantic, scale and spatial context.Footnote142 Semantic context describes the likelihood of an object appearing in a particular type of scene. Scale context describes the relative size of an object to surrounding objects, and spatial context considers the objects which are likely to surround a particular object.Footnote143 These attempts to move towards greater scene understanding and beyond literal detection will increase the accuracy of computer vision systems. However, increasing accuracy, as noted above, will also necessitate more effective privacy-focused intervention techniques, particularly obfuscation.Footnote144
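The following sketch illustrates, in highly simplified form, how scale and spatial context features of the kind categorised by Divvala and others might be computed from a set of detected bounding boxes. The feature definitions and neighbourhood radius are assumptions; modern systems typically learn such context implicitly rather than hand-crafting it.

```python
# Simplified, hand-rolled scale and spatial context features for a detected box.
# Illustrative only; not drawn from any particular published implementation.
import math

def box_area(box):
    x1, y1, x2, y2 = box
    return max(0, x2 - x1) * max(0, y2 - y1)

def box_centre(box):
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2, (y1 + y2) / 2)

def context_features(target, others, neighbour_radius=200):
    """target/others: dicts with 'box' and 'label' keys (pixel coordinates)."""
    tx, ty = box_centre(target["box"])
    neighbours = [
        o for o in others
        if math.dist((tx, ty), box_centre(o["box"])) < neighbour_radius
    ]
    # Scale context: target size relative to the mean size of its neighbours.
    mean_area = (sum(box_area(o["box"]) for o in neighbours) / len(neighbours)
                 if neighbours else 0)
    relative_scale = box_area(target["box"]) / mean_area if mean_area else None
    # Spatial context: which object classes co-occur around the target.
    surrounding_labels = sorted({o["label"] for o in neighbours})
    return {"relative_scale": relative_scale, "surrounding_labels": surrounding_labels}
```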

As noted above, the most common methods of image obfuscation are pixelation and blurring.Footnote145 Pixelation, or mosaicking, obfuscates part of an image by dividing the targeted area into a square grid and computing the average colour of the pixels within each square.Footnote146 The entire square (the ‘pixel box’) is set to that colour, with the size of the boxes controlling the granularity of the pixelation.Footnote147 Blurring removes detail from an image by applying a Gaussian kernel which smooths the targeted area.Footnote148 As Section II outlines, object obfuscation techniques are not a silver bullet: they do not remove all of the information from an image, but merely limit a human’s ability to interpret the obfuscated information. The preservation of some information, though more visually pleasing than a complete redaction, is vulnerable to extraction by advanced image recognition models,Footnote149 particularly in relation to contexts that can give rise to identification.
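A minimal sketch of the pixelation (mosaicking) technique described above follows: the targeted region is downsampled so that each grid square takes the average colour of the pixels it covers, then upsampled without smoothing. The block size and region coordinates are assumptions.

```python
# Minimal pixelation (mosaicking) sketch using OpenCV resizing; parameters are
# illustrative assumptions.
import cv2

def pixelate_region(image, box, block_size=16):
    """image: HxWx3 uint8 array; box: (x1, y1, x2, y2) pixel coordinates."""
    x1, y1, x2, y2 = box
    region = image[y1:y2, x1:x2]
    h, w = region.shape[:2]
    # Downsample to one pixel per block (area interpolation averages colours) ...
    small = cv2.resize(region, (max(1, w // block_size), max(1, h // block_size)),
                       interpolation=cv2.INTER_AREA)
    # ... then upsample with nearest-neighbour so the blocks stay flat.
    mosaic = cv2.resize(small, (w, h), interpolation=cv2.INTER_NEAREST)
    out = image.copy()
    out[y1:y2, x1:x2] = mosaic
    return out
```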

Two notable alternative methods of obfuscation are k-same and GAN-based obfuscation.Footnote150 K-same obfuscation has been used for facial blurring by clustering faces based on non-identifiable information such as expression, then generating a surrogate face for each cluster.Footnote151 This ensures that a re-identification system cannot be more accurate than 1/k in assigning a face to an individual.Footnote152 GAN-based methods, popularised by ‘deep fakes’, can provide more realistic faces whilst still removing identifiable information. These models use contrastive loss and conditional GANs to ensure faces are obscured from the source input.Footnote153 Other, more esoteric methods of image obfuscation have also been explored, for example black boxes,Footnote154 cartooning,Footnote155 full-body masking,Footnote156 and inpainting.Footnote157 These methods each seek to reach an optimal balance between obscuring human recognition, obscuring machine recognition, and preserving visual quality.
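The following is a highly simplified sketch of the k-same idea: face crops are grouped and each face is replaced by a surrogate built from its group (here, a pixel-wise mean). Real k-same variants cluster on non-identifying attributes and enforce minimum group sizes of k; the feature representation and grouping below are illustrative assumptions only.

```python
# Highly simplified k-same-style surrogate generation; not a faithful
# implementation of any published k-same algorithm.
import numpy as np
from sklearn.cluster import KMeans

def k_same_surrogates(face_images, k=5):
    """face_images: N aligned face crops of identical shape (H, W, C), uint8."""
    n = len(face_images)
    flat = np.stack([f.reshape(-1) for f in face_images]).astype(np.float32)
    n_clusters = max(1, n // k)          # aim for roughly k faces per cluster
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(flat)

    surrogates = []
    for lbl, face in zip(labels, face_images):
        members = flat[labels == lbl]
        mean_face = members.mean(axis=0).reshape(face.shape).astype(np.uint8)
        surrogates.append(mean_face)     # every face in a cluster shares one surrogate
    return surrogates
```

Note that unconstrained k-means does not guarantee each cluster contains at least k faces, which a genuine k-same scheme requires to deliver its 1/k re-identification bound.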

The outlined obfuscation methods all make object detection more difficult. For example, pixelation can remove edge details which may be critical for object localisation, whilst blurring may diffuse contrast features necessary for object classification. However, autonomous driving systems are increasingly trained on datasets which already have sensitive information obscured.Footnote158 Such systems are inherently more robust in detecting and processing obfuscated imagery.

Furthermore, deep learning techniques have been developed which are able to circumvent these methods and access the underlying information. Tekli and others described techniques which identify obscured information through either ‘Recognition-based’ or ‘Restoration-based’ attacks.Footnote159 Recognition-based algorithms are trained to recognise information from within obscured images, whilst Restoration-based methods attempt to reconstruct the original features that have been obscured.Footnote160 The introduction of automated or semi-automated context-dependent detection of privacy features may introduce a third class: one which negates the requirement for recognition or restoration, instead using extraneous information within an image to infer obscured detail. This evolving interplay between obfuscation methods and detection techniques underscores the need to regularly revisit nuanced privacy considerations regarding computer vision technologies.

The above analysis of technical developments highlights that the type and level of object obfuscation employed by SLIPs could be augmented. First, it is possible, by applying techniques developed for autonomous vehicles, to still focus on object identification but to do so in a more concentrated fashion that better identifies captured image objects and potentially the relationships between objects. Second, obfuscation methods can also be employed with different types of object detection technique to provide a closer identification of certain objects to be obfuscated (e.g. faces) and to provide enhanced obfuscation methods that are more comprehensive in coverage and more difficult to re-engineer. These developments highlight that there are possibilities for improving object obfuscation which would better understand and respond to the image environment of captured objects.

The two improvements highlighted above would increase the scope of contextually-dependent forms of identification, but they would not provide automated processes through which potentially privacy infringing contexts could be identified or detected. Accurately detecting the types of privacy infringing context highlighted above through machine learning, for legal risk management purposes, still appears to be some distance away, despite the improvements to object obfuscation that could be made. This leads us back to Brunton and Nissenbaum’s second context connotation and the need for SLIPs to provide more contextually-dependent risk management processes that better understand the broader context of power asymmetries and the establishment of expectations involved in privacy-by-design processes.

B. Contextually-dependent privacy by design

In an effort to grapple with the threats to privacy posed by the types of ubiquitous digital data collection characterised by SLIPs, legislators have increasingly turned to a ‘privacy-by-design’ paradigm.Footnote161 The foundations of privacy-by-design were laid several decades ago, most notably in the work of former Information and Privacy Commissioner for Ontario, Ann Cavoukian,Footnote162 and embraced by policymakers across many jurisdictions.Footnote163 The principle gained a new statutory emphasis and articulation with the adoption of the General Data Protection Regulation. Article 25 of the Regulation requires that firms:

 … implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.Footnote164

In essence, the provision broadens regulatory focus beyond individual rights and a post facto remedy to systemic, ex ante and lifecycle protections, by requiring organisations to build technical and organisational safeguards into system architecture that ensures compliance with data protection obligations and rights.

While Article 25 represents the clearest legislative articulation of a privacy-by-design obligation to date, it is nonetheless ambiguous in scope and operation.Footnote165 The Regulation supplies a few examples of the types of technical and organisational measures which might be required but refrains from further detail, leaving significant room for interpretation as to the concrete requirements in a given context. Indeed, the context-dependent nature of the obligation is explicit in the provision.Footnote166 What constitutes an ‘appropriate’ technical and organisational measure and a ‘necessary’ safeguard is, according to Article 25, dependent upon a range of factors, including the context of data processing and the severity of the privacy risk it poses. As noted in Section III, and throughout the paper, context is central to legal conceptions of privacy invasion across jurisdictions and bodies of law. Identifying the features of a context that heighten or ameliorate the risk of privacy invasion is also a notoriously difficult and ongoing social and legal project. Another factor which must be considered in determining appropriate technical measures under Article 25 is the ‘state of the art’. As Bygrave observes, the provision assumes the existence or emergence of growing markets for privacy-enhancing technologies (PETs), propelling advancement of the ‘state of the art’.Footnote167 In reality, however, the innovation and uptake of PETs across various domains has stagnated;Footnote168 a situation we observe in the street-level mapping domain.

We argue that while the largest SLIPs have long implemented technical safeguards for privacy into their products, the approach taken by key members of the industry falls short of privacy-by-design ideals in some key respects. The implementation of technical and organisational safeguards in major SLIPs was largely a reactive process, ‘bolted on as an add on, after the fact … ’,Footnote169 rather than embedded into the architecture of a system.Footnote170 Further, the initial build and engineering of SLIPs relied upon indiscriminate data collection, as opposed to data minimisation, which Seda Gurses and others rightly cast as a ‘necessary and foundational first step to engineer systems in line with the principles of privacy by design.’Footnote171 Moreover, we argue that the standard approach to privacy-by-design, with its predominant focus on data protection, is insufficient for SLIPs because it fails to sufficiently account for the contextual nature of privacy interests. A more contextually-dependent form of privacy-by-design is required that can accommodate Brunton and Nissenbaum’s ‘multi-faceted’ nature of privacy and the different types of legal issue that could arise.

Section III highlighted the different legal frameworks of tortious privacy and data protection that are implicated in SLIP data collections and usages. Both frameworks provide core legal protections, and do so in separate ways with different foci. Privacy torts engender a focus on reasonable expectations of privacy, predicated historically on a clear distinction between private and public realms. That traditional (and highly contested) distinction between private and public has of course been further disrupted by novel technologies such as SLIPs. Nevertheless, the distinction remains a prime theme in some of the key cases, such as Boring, which still seek to ground reasonable expectations of privacy in differing privacy contexts, especially those related to the private realm. Data protection law focuses on providing designated levels of assurance and control for individuals, to ensure that data collectors handle personal data in accordance with a range of legal boundaries including legal authority, contractual obligations and consent. The law’s guiding focus is the balancing of individual control with the organisational exigencies of data collectors. Privacy-by-design methods that focus exclusively on data protection, such as Article 25 of the GDPR, even allowing for their scope for implementation and development, do not fully encapsulate the requirement for a reasonable expectations analysis that is an essential part of SLIP pre-emptive privacy considerations.

We contend that a reasonable expectations analysis does not simply entail the layering of one legal framework over another as part of a data-protection-driven privacy-by-design bolt-on.Footnote172 Rather, our inspiration for the deeper, contextually-dependent analysis we envisage is again drawn from Brunton and Nissenbaum’s consideration of power asymmetries. Brunton and Nissenbaum rightly contend that there is a foundational link between obfuscation as a privacy strategy and its use as an ameliorator of power.Footnote173 In that sense, throughout their work, obfuscation is treated as a tool of the repressed to equalise power asymmetries in online environments.

Our use of their obfuscation construct has been different throughout our paper. We have used obfuscation techniques as a point of critique of the powerful rather than a tool for the powerless. We consider obfuscation as a method for organisations to build privacy into data collection systems by design rather than as a form of deliberate resistance to surveillance and data collection.Footnote174 In this regard, the noise that obfuscation strategies produce is not an individual protection. Rather, it is representative of an exercise of power that seeks to enshrine a normative understanding of reasonable expectations of privacy. In the case of SLIPs, that understanding is based on the ‘better-than-nothing’ provision of automated object obfuscation, with the dubious fallback of individuals reporting online any published obfuscation errors or contextual complexities that lie beyond machine learning techniques.

We highlight above the scope of Article 25 and its inherent contextual focus on ‘appropriate’ technical and organisational measures that must consider the ‘state of the art.’ The immediately preceding sub-section identifies existing identification and obfuscation techniques that could potentially be used to develop a truly ‘state of the art’ automated system for object obfuscation, one that would better address the systemic concerns to which both privacy torts and data protection law speak. However, while improved technical forms of object obfuscation in SLIPs could provide enhancements, they would still not deal with the more complex question of contextual identification of potentially privacy-infringing imagery, which is currently undertaken only after collection and publication. The question therefore arises whether SLIPs should be required to implement a broader, contextually-dependent type of privacy-by-design that considers ‘state of the art’ technical measures and whether they are appropriate for the publication of global street-level imagery.
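
To make the underlying mechanics concrete, the following is a minimal illustrative sketch of the detect-then-obfuscate pattern on which automated object obfuscation rests. It is our own simplified example, not the pipeline used by any SLIP: it assumes a pre-trained OpenCV Haar-cascade face detector and a fixed blur kernel, whereas production systems rely on far larger deep-learning detectors of the kind cited above.

```python
import cv2


def obfuscate_faces(image_path: str, output_path: str) -> int:
    """Detect faces with a pre-trained Haar cascade and blur each detected
    region before publication. Illustrative only; all thresholds are assumptions."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    image = cv2.imread(image_path)
    grey = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)

    for (x, y, w, h) in faces:
        region = image[y:y + h, x:x + w]
        # A heavy Gaussian blur renders the detected region unrecognisable.
        image[y:y + h, x:x + w] = cv2.GaussianBlur(region, (51, 51), 0)

    cv2.imwrite(output_path, image)
    return len(faces)  # regions obfuscated; undetected faces are published unblurred
```

Any face the detector misses is published unblurred, which is precisely the false-negative failure mode discussed earlier; the sketch therefore also illustrates why object obfuscation alone cannot carry the full privacy-protection burden.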

We contend that they should, given the indiscriminate nature of SLIP data collections and the unprecedented global scale at which collection takes place, often without the knowledge or express consent of the individuals inadvertently captured. Combining contextually-dependent automated privacy detection techniques with contextually-dependent privacy-by-design legal risk management processes would shift obfuscation from a ‘humble, provisional, better-than-nothing’Footnote175 solution to one that is appropriately ‘deeply entangled with the context of use’Footnote176 and would thus give fuller effect to Brunton and Nissenbaum’s warnings about the relationship between obfuscation and power.
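
By way of illustration only, a contextually-dependent detection stage might combine detected objects with coarse scene context to produce a privacy-risk score that triggers obfuscation or human review. The sketch below is entirely our own assumption: the labels, context categories and weights are hypothetical placeholders, not proposed values and not a description of any deployed system.

```python
from dataclasses import dataclass


@dataclass
class Detection:
    label: str         # e.g. "face", "licence_plate", "house_number"
    confidence: float  # detector confidence in [0, 1]


# Hypothetical weights: sensitive object types and scene contexts that heighten risk.
OBJECT_WEIGHTS = {"face": 0.6, "licence_plate": 0.5, "house_number": 0.3}
CONTEXT_WEIGHTS = {"residential": 0.3, "hospital": 0.5, "protest": 0.6, "commercial": 0.1}


def privacy_risk(detections: list[Detection], scene_context: str) -> float:
    """Return a rough risk score in [0, 1]; higher scores trigger obfuscation or review."""
    object_risk = max(
        (OBJECT_WEIGHTS.get(d.label, 0.0) * d.confidence for d in detections),
        default=0.0,
    )
    context_risk = CONTEXT_WEIGHTS.get(scene_context, 0.2)
    return min(1.0, object_risk + context_risk)


# Example: the same detected face scores higher outside a hospital than on a
# commercial high street, reflecting the contextual nature of the privacy risk.
frame = [Detection("face", 0.9)]
print(privacy_risk(frame, "hospital"))    # 1.0 (capped)
print(privacy_risk(frame, "commercial"))  # 0.64
```

In such a design, identical objects attract different treatment depending on where and alongside what they are captured, which is the kind of contextual sensitivity that a purely object-based obfuscation pipeline cannot express.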

V. Conclusion

Our paper highlights the contextual complexities inherent in SLIP data collections and publications. The indiscriminate nature of SLIP data collections is partially offset by an equally blunt protective privacy measure, object obfuscation. Common objects that are obfuscated, based on prior legal or regulatory actions, include facial features and vehicle registration plates. Nevertheless, we identify four types of obfuscation failure that emerge from SLIPs: false negatives, false positives, the Streisand Effect and contextual identification. The four failures demonstrate the complex legal requirements for SLIPs that emanate from both privacy torts and data protection law. These complex legal issues arise because of the ‘multi-faceted nature’ of privacy, in which object obfuscation is but one tool in a much larger toolbox. However, SLIPs tend to portray object obfuscation as the key tool available, which in turn attempts to shape broader expectations about the level of privacy protection that is attainable. Based on the work of Brunton and Nissenbaum, we contend that contextually-dependent privacy detection and privacy-by-design processes are required in SLIPs to ensure that the privacy expectations generated are indeed reasonable and in keeping with the types of appropriate, state of the art technical measures necessary to safeguard against the contextual risks that emanate from SLIP data collections.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This research was partially supported by funding from Australian Research Council Laureate Fellowship FL210100156.

Notes

1 Finn Brunton and Helen Nissenbaum, Obfuscation: A User's Guide for Privacy and Protest (The MIT Press, 2015), 95. As noted below, while we draw on Brunton and Nissenbaum’s concept of obfuscation, we nevertheless use it in a different context. We consider obfuscation as a method for organisations to build privacy into data collection systems by design rather than as a form of deliberate resistance to surveillance and data collection, as described by Brunton and Nissenbaum. The use is justified because both Brunton and Nissenbaum’s considerations and our own have a power-related context, albeit one that differs in its application to SLIPs.

2 For example, Apple, ‘Apple Look Around’ (Maps, 2023) <https://www.apple.com/au/maps/> accessed 4 September 2023; Mapillary, ‘Make Better Maps’ (2023) <https://www.mapillary.com/> accessed 4 September 2023; KartaView, ‘Easy Mapping’ (2023) <https://kartaview.org/landing> accessed 4 September 2023; and Microsoft, ‘Bing Streetside’ (Streetside View, 2023) <https://www.microsoft.com/en-us/maps/bing-maps/streetside> accessed 4 September 2023.

3 Lauren Rakower, ‘Blurred line: Zooming in on Google Street View and the Global Right to Privacy’ (2011) 37 Brook. J. Int'l L 317.

4 Alternative methods include crowd-sourcing street imagery from voluntary user submissions, for example, as employed by street-level imagery app Mapillary.

5 Josh Blackman, ‘Omniveillance, Google, Privacy in Public, and the Right to Your Digital Identity: A Tort for Recording and Disseminating an Individual's Image Over the Internet’ (2009) 49 Santa Clara L. Rev. 313.

6 Roger Geissler, ‘Private Eyes Watching You: Google Street View and the Right to an Inviolate Personality’ (2012) 63 Hastings L. J. 897.

7 Sarah Elwood and Agnieszka Leszczynski, ‘Privacy, Reconsidered: New Representations, Data Practices, and the Geoweb’ (2011) 42 Geoforum 6, 10.

8 Jana McGowen, ‘Your Boring Life, Now Available Online: Analyzing Google Street View and the Right to Privacy’ (2010) 16 Tex. Wesleyan L. Rev. 477.

9 Brunton and Nissenbaum (n 1) 95.

10 Ibid 56.

11 Ibid 45.

12 Introduced first in the United States, Google has since expanded its product to cover all seven continents and over 100 countries and territories. See Rakower (n 3) 323.

13 Blackman (n 5).

14 The use of temporary bans was particularly prominent in Europe. For example, Greece, Lithuania, the Czech Republic and Austria all initiated regulatory action against Street View. See, for example, Helena Smith, ‘Google Street View Banned from Greece’ The Guardian (Athens, 13 May 2009) <https://www.theguardian.com/technology/2009/may/12/google-street-view-banned-greece> accessed 4 September 2023. See also Geissler (n 6) 899.

15 Ingrid Hoelzl and Rémi Marie, ‘Google Street View: Navigating the Operative Image’ (2014) 29 Vis. Stud. 261, 268.

16 Aggi Cantrill and Stephanie Bodoni, ‘Google Street View to Post First New Pictures of Germany in a Decade’ Bloomberg Technology (America, 26 July 2023) <https://www.bloomberg.com/news/articles/2023-07-25/google-street-view-to-post-first-new-german-pictures-in-a-decade> accessed 4 September 2023.

17 Lindsey Strachan, ‘Re-mapping Privacy Law: How the Google Maps Scandal Requires Tort Law Reform’ (2011) 17(4) Rich. J. L. & Tech 1.

18 Ira Rubinstein and Nathaniel Good, ‘Privacy by Design: A Counterfactual Analysis of Google and Facebook Privacy Incidents’ (2013) 28 Berkeley Tech. L. J. 1333, 1384.

19 The first round of Street View collections was undertaken without any prior notification to communities. This has subsequently changed and Google now publishes online details of Street View capture. See Google, ‘Discover When, Where, and How, We Collect 360 Imagery’ (2023) <https://www.google.com.au/streetview/how-it-works/> accessed 4 September 2023.

20 Claudia Cuador, ‘From Street Photography to Face Recognition: Distinguishing Between the Right to be Seen and the Right to be Recognized’ (2017) 41 Nova L. Rev. 237.

21 Neil Richards, ‘The Dangers of Surveillance’ (2013) 126 Har. L. Rev. 1934.

22 Google reported in May 2022 that Street View comprises over 220 billion individual images. See Lauren Forristal, ‘Google Maps’ Street View Celebrates 15 Years with Historical Imagery on Mobile, New Camera and More’ TechCrunch (America, 25 May 2022) <https://techcrunch.com/2022/05/24/google-maps-street-view-celebrates-15-years-with-historical-imagery-on-mobile-new-camera-and-more/> accessed 4 September 2023.

23 James Thornton, ‘Individual Privacy Rights with Respect to Services such as Google Street View’ (2010) 40 Computers & Society 70.

24 Geissler (n 6); Rubinstein and Good (n 18) 1384. Rubinstein and Good note that Google’s ex ante approach to potential harm was a deliberate strategy.

25 Hoelzl and Marie (n 15) 268. For example, the above-mentioned German resistance led Google to sign a binding memorandum of understanding with the German Data Protection Agency in which they agreed, inter alia, to blur faces and licence plates before publication and provide residents the ability to have their homes removed or blurred.

26 Siva Vaidhyanathan, The Googlization of Everything:(and Why We Should Worry) (University of California Press, 2012) 101.

27 Andrea Frome and others, ‘Large-scale Privacy Protection in Google Street View’ (IEEE 12th International Conference on Computer Vision, Kyoto, 29 September 2009) <https://ieeexplore.ieee.org/document/5459413> accessed 4 September 2023.

28 Ibid 2378.

29 Ibid 2380.

30 Ibid.

31 Darren Waters, ‘Google’s Street View Response’ BBC News UK (London 7 July 2008) <https://www.bbc.co.uk/blogs/technology/2008/07/googles_street_view_response.html> accessed 4 September 2023.

32 Alan Weedon, ‘Why Large Swathes of Countries are Censored on Google Maps’ ABC News (Australia 21 February 2019) <https://www.abc.net.au/news/2019-02-21/why-large-parts-of-earth-are-censored-by-google-maps/10826024> accessed 4 September 2023. See also Patrick Gallo and Houssain Kettani, ‘On Privacy Issues with Google Street View’ (2020) 65 S. D. L. Rev 608, 612.

33 Frome and others (n 27).

34 Kaiyu Yang and others, ‘A Study of Face Obfuscation in ImageNet’ (10 March 2021) Arxiv <doi:10.48550/arxiv.2103.06191> accessed 4 September 2023.

35 Geissler (n 6) 994 referring to the obfuscation of Colonel Sanders on KFC billboards throughout the UK.

36 Harriet Mallinson, ‘Google Maps Street View’s Big Photo Mistake Exposed – Can You Spot it?’ The Express (London 25 April 2020) <https://www.express.co.uk/travel/articles/1271378/google-maps-street-view-privacy-statue-face-blur-funny-photo> accessed 4 September 2023.

37 BBC News, ‘Google Street View Blurs Bullock’s Face in Cambridge’ BBC Cambridgeshire (London 15 September 2016) <https://www.bbc.com/news/uk-england-cambridgeshire-37378007> accessed 4 September 2023.

38 Hoelzl and Marie (n 15) 263.

39 Mario Cacciottolo, ‘The Streisand Effect: When Censorship Backfires’ BBC News UK (London 15 June 2012) <https://www.bbc.com/news/uk-18458567> accessed 4 September 2023.

40 Sue Curry Jansen and Brian Martin, 'The Streisand Effect and Censorship Backfire' (2015) 9 Int. J. Commun. 656.

41 Ibid.

42 Mikael Thalen, ‘Google Maps Thrust into Fight Over Roe v. Wade in Wake of Protests at Brett Kavanaugh’s House’ Daily Dot (America May 11 2022) <https://www.dailydot.com/debug/google-maps-blurring-supreme-court-justice-home-protest> accessed 4 September 2023.

43 Mikael Thalen, ‘Google Maps Thrust into Fight Over Roe v Wade in Wake of Protests at Brett Kavanaugh’s House’ VisionViral.com (10 May 2022) <https://visionviral.com/google-maps-thrust-into-fight-over-roe-v-wade-in-wake-of-protests-at-brett-kavanaughs-house/> accessed 4 September 2023.

44 Teresa Scassa, ‘Geographic Information as Personal Information’ (2010) 10 OUCLJ 208.

45 Pia Grillo v Google Inc. (2014) QCCQ 9394 (Can.).

46 Ibid [11].

47 Ibid [56].

48 Ibid [57]. Note also that Google admitted an obfuscation failure regarding the licence plate, which could be classed as a false negative error, thus re-emphasising our point that there are crossovers between the types of failure.

49 The court also rejected the argument that the plaintiff had tacitly waived her right to privacy because she was seated outside her home and thus visible from the street. See Ibid [49]–[51].

50 Helen Nissenbaum, ‘Contextual Integrity Up and Down the Data Food Chain’ (2019) 20 Theo Inq L 221.

51 Deirdre Mulligan and others, ‘Privacy is an Essentially Contested Concept: A Multi-Dimensional Analytic for Mapping Privacy’ (2016) 374 Philos. Trans. R. Soc. A 1.

52 Surveillance Devices Act 2007 (NSW), ‘Surveillance Devices Act 2007 No 64’ (Legislation) 16 May 2022 <https://legislation.nsw.gov.au/view/html/inforce/current/act-2007-064> accessed 1 February 2024; Invasion of Privacy Act 1971 (Qld), ‘Invasion of Privacy Act 1971’ (Legislation) 5 June 2017 <https://www.legislation.qld.gov.au/view/html/inforce/current/act-1971-050> accessed 1 February 2024; Listening and Surveillance Devices Act 1972 (SA), ‘Listening and Surveillance Devices Act 1972’ (Legislation) 4 September 2017 <https://www.legislation.sa.gov.au/__legislation/lz/c/a/listening%20and%20surveillance%20devices%20act%201972/2017.12.17/1972.112.auth.pdf> accessed 1 February 2024; Listening Devices Act 1991 (Tas), ‘Listening Devices Act 1991’ (Legislation) 5 October 2018 <https://www.legislation.tas.gov.au/view/html/inforce/current/act-1991-021> accessed 1 February 2024; Surveillance Devices Act 1999 (Vic),‘Surveillance Devices Act 1999’ (Legislation) 1 December 2021 <https://content.legislation.vic.gov.au/sites/default/files/2021-12/99-21aa042%20authorised.pdf> accessed 1 February 2024; Surveillance Devices Act 1998 (WA), ‘Surveillance Devices Act 1998’ (Legislation) 5 April 2023 <https://www.legislation.wa.gov.au/legislation/statutes.nsf/law_a1919.html> accessed 1 February 2024; Listening Devices Act 1992 (ACT), ‘Listening Devices Act 1992’ (Legislation) 11 February 2022 <https://www.legislation.act.gov.au/View/a/1992-57/current/html/1992-57.html> accessed 1 February 2024; Surveillance Devices Act 2007 (NT), ‘Surveillance Devices Act 2007’ (Legislation) 30 November 2018 <https://legislation.nt.gov.au/en/Legislation/SURVEILLANCE-DEVICES-ACT-2007> accessed 1 February 2024; Cal. Penal Code § 647(j)(1) ‘Codes: Codes Tree – Penal Code – PEN’ (Legislation) 1 January 2023 <https://leginfo.legislature.ca.gov/faces/codesTOCSelected.xhtml?tocCode=PEN&tocTitle=±Penal±Code±-±PEN> accessed 1 February 2024.

53 See, e.g., Boring v Google Inc., 362 Fed Appx 273, 38 Media L Rep 1306 (3d Cir Pa Jan 28, 2010) (‘Boring v Google’).

54 For a useful overview of the different causes of action available in New Zealand, Canada, the United Kingdom, and the United States, see Australian Law Reform Commission, Serious Invasions of Privacy in the Digital Era (ALRC Report No 123, June 2014), 22–23. Available at Australian Law Reform Commission, ‘Serious Invasions of Privacy in the Digital Era’ (Report) June 2014 <https://www.alrc.gov.au/wp-content/uploads/2019/08/final_report_123_whole_report.pdf> accessed 1 February 2024. In the UK, privacy interests have also been protected via extensions of the equitable action for breach of confidence.

55 A misuse of private information is recognised in the United Kingdom, while US jurists refer to publicity given to private life: Campbell v Mirror Group Newspapers Limited [2004] UKHL 22; American Law Institute, US Restatement of the Law Second, Torts (1977), § 652D. The New Zealand Court of Appeal confirmed the existence of a tort of wrongful disclosure of private information in its decision in Hosking v Runting [2004] NZCA 34; [2005] 1 NZLR 1.

56 Not all these causes of action currently exist in the jurisdictions which do recognise privacy torts at common law (or statute).

57 American Law Institute (n 55), § 652B; William Prosser, ‘Privacy’ (1960) 48 Cal. L. Rev. 383; C v Holland [2012] NZHC 2155; Jones v Tsige (2012) ONCA 32.

58 American Law Institute (n 55).

59 Ibid, § 652B; Jones v Tsige (2012) ONCA 32 (n 57) [17]; C v Holland [2012] NZHC 2155 (n 57) [94].

60 American Law Institute (n 55), § 652D. Note that the UK courts have used the formulation ‘misuse of private information’, which may encompass a broader range of activities than disclosure, communication, or publication.

61 Ibid; C v Holland [2012] NZHC 2155 (n 57). The highly offensive test does not form part of the tort in the UK.

62 Paul Wragg, ‘Recognising a Privacy-Invasion Tort: The Conceptual Unity of Informational and Intrusion Claims’ (2019) 78 C.L.J. 409.

63 Prosser (n 57).

64 Milner v Manufacturer’s Life Insurance Co (c.o.b Manulife Financial) (2005) BCSC 1661; Brooker v Police [2007] NZSC 30.

65 Blackman (n 5) 313; Nancy D Zeronda, ‘Street Shootings: Covert Photography and Public Privacy’ 63 Vand. L. Rev. 1131, 1145; Stuart Hargreaves, ‘“Jones-Ing” for a Solution: Commercial Street Surveillance and Privacy Torts in Canada’ (2014) 3 Laws 388, 390.

66 Daily Times Democrat v Graham 276 Ala 380, 162 So 2d 474 (1964); Gill v Hearst Publ’g Co, 253 P.2d 441, 446 (Cal 1953).

67 Daily Times Democrat v. Graham (n 66). In the Daily Times case, the respondent was photographed at a county fair at a moment when the wind had blown her skirt into the air. The appellant subsequently published the photograph in a newspaper. Cf. McNamara v Freedom Newspapers, 802 S.W.2d 901, 903 (Tex Ct App 1991).

68 McNamara v. Freedom Newspapers (n 67).

69 Campbell v Mirror Group Newspapers Ltd (n 55).

70 Weller v Associated Newspapers Ltd [2016] 1 WLR 1541; Murray v Express Newspapers Plc and another [2008] EWCA Civ 446.

71 Nicole Moreham, ‘Unpacking the Reasonable Expectation of Privacy Test’ (2018) 134 L.Q.R. 651, 4.

72 Ibid 5.

73 Ibid 1, 8–11.

74 Strachan (n 17) 14.

75 Jones v Tsige (n 57).

76 Boring v Google (n 53) 279.

77 See, e.g., Australian Law Reform Commission (n 54); Campbell v Mirror Group Newspapers Ltd (n 55) [75].

78 Some jurisdictions have codified certain prohibitions on intrusion into seclusion, namely the surreptitious recording of images or conversations through surveillance devices is prohibited. See, e.g., Surveillance Devices Act 1999 (Vic),‘Surveillance Devices Act 1999’ (Legislation) 1 December 2021 <https://content.legislation.vic.gov.au/sites/default/files/2021-12/99-21aa042%20authorised.pdf> accessed 1 February 2024; Surveillance Devices Act 2007 (NSW), ‘Surveillance Devices Act 2007 No 64’ (Legislation) 16 May 2022 <https://legislation.nsw.gov.au/view/html/inforce/current/act-2007-064> accessed 1 February 2024; Listening and Surveillance Devices Act 1972 (SA), ‘Listening and Surveillance Devices Act 1972’ (Legislation) 4 September 2017 <https://www.legislation.sa.gov.au/__legislation/lz/c/a/listening%20and%20surveillance%20devices%20act%201972/2017.12.17/1972.112.auth.pdf> accessed 1 February 2024; Surveillance Devices Act 1998 (WA), ‘Surveillance Devices Act 1998’ (Legislation) 5 April 2023 <https://www.legislation.wa.gov.au/legislation/statutes.nsf/law_a1919.html> accessed 1 February 2024; Listening Devices Act 1991 (Tas), ‘Listening Devices Act 1991’ (Legislation) 5 October 2018 <https://www.legislation.tas.gov.au/view/html/inforce/current/act-1991-021> accessed 1 February 2024; Invasion of Privacy Act 1971 (Qld),, ‘Invasion of Privacy Act 1971’ (Legislation) 5 June 2017 <https://www.legislation.qld.gov.au/view/html/inforce/current/act-1971-050> accessed 1 February 2024.

79 Jamuna Kelley, ‘A Computer with a View: Progress, Privacy, and Google’ (2008) 74 Brook. L. Rev. 187, 214; Andrew McClurg, ‘Bringing Privacy Law Out of the Closet: A Tort Theory of Liability for Intrusions in Public Places’ 73 N. C. L. Rev. 989, 1068.

80 Strachan (n 17); Blackman (n 5); Andrew Lavoie, ‘The Online Zoom Lens: Why Internet Street-Level Mapping Technologies Demand Reconsideration of the Modern-Day Tort Notion of “Public Privacy”’ (2009) 43 Ga. L. Rev 575.

81 The rarity of claims is perhaps unsurprising, given that would-be claimants find themselves in a paradoxical situation where seeking recourse for privacy invasion might have the opposite and undesirable effect of magnifying the facts or images they wish to keep private (a catch-22 which factored into the Borings’ unsuccessful claim): Boring v Google (n 53).

82 Jordan Segall, ‘Google Street View: Walking the Line of Privacy-Intrusion upon Seclusion and Publicity Given to Private Facts in the Digital Age’ (2010) 10 Pittsburgh Journal of Technology Law & Policy, 1, 14–19.

83 Hargreaves (n 65) 399; McGowen (n 8) 478.

84 Boring v Google (n 53).

85 Pia Grillo v Google (n 45).

86 See e.g., Pia Grillo v Google (n 45) [66].

87 Mathew Weaver, ‘Google Street View Cleared of Breaking Data Protection Act’ The Guardian (London 23 April 2009) <https://www.theguardian.com/technology/2009/apr/23/google-street-view-data-protection-cleared> accessed 4 September 2023. The article cites David Evans, senior data protection practice manager at the UK Information Commissioner’s Office (ICO) stating that the office took a ‘pragmatic and common-sense approach’ to obfuscation as a data protection issue in relation to Google Maps.

88 Colin Bennett and Charles Raab, The Governance of Privacy: Policy Instruments in Global Perspective (MIT Press, 2006); Lisa Austin, ‘Re-reading Westin’ (2019) 20 Theo Inq L 53; Stephen Margulis, ‘On the Status and Contribution of Westin’s and Altman’s Theories of Privacy’ (2003) 59 J. Soc. Issues 411.

89 Colin Bennett, ‘The European General Data Protection Regulation: An Instrument for the Globalization of Privacy Standards?’ (2018) Information Polity 239.

90 Moira Paterson and Maeve McDonagh, ‘Data Protection in an Era of Big Data: The Challenges Posed by Big Personal Data’ (2018) 44 Mon LR 1.

91 Fred Cate, ‘The Failure of Fair Information Practice Principles’ in Jane Winn (ed), Consumer Protection in the Age of the ‘Information Economy’ (Ashgate, 2006); Rubinstein and Good (n 18) 1343.

92 Joris van Hoboken, ‘From Collection to Use In Privacy Regulation? A Forward-Looking Comparison of European and US Frameworks for Personal Data Processing’ in Bart van der Sloot, D Broeders and E Schrijvers (eds), Exploring the Boundaries of Big Data (Amsterdam University Press, 2016).

93 Bert-Jaap Koops, ‘On Decision Transparency, or How to Enhance Data Protection After the Computational Turn’ in M. Hildebrandt and Katja De Vries (eds), Privacy, Due Process and the Computational Turn The Philosophy of Law Meets the Philosophy of Technology (Taylor and Francis, 2013).

94 Daniel Susser, ‘Notice After Notice-and-Consent: Why Privacy Disclosures Are Valuable Even If Consent Frameworks Aren't’ (2019) 9 J. Inf. Policy 37.

95 Paul Schwartz and Edward Janger, ‘Notification of Data Security Breaches’ (2007) 105 Mich. L. Rev 913.

96 Rubinstein and Good (n 18) 1347.

97 Bennett and Raab (n 88).

98 Geissler (n 6).

99 Orla Lynskey, ‘Grappling with “Data Power”: Normative Nudges from Data Protection and Privacy’ (2019) 20 Theo Inq L 189, 191.

100 Paul Schwartz and Karl-Nikolaus Peifer, ‘Transatlantic Data Privacy Law’ (2017) 106 Geo.L.J. 115.

101 Mark Burdon, Digital Data Collection and Information Privacy Law (Cambridge University Press, 2020).

102 See discussion regarding the Pia Grillo decision, as an example.

103 Daniel Solove and Paul Schwartz, ‘The PII Problem: Privacy and a New Concept of Personally Identifiable Information’ (2011) 86 N.Y.U.L. Rev. 1814, 1816.

104 GDPR under Article 4(1) as [A]ny information relating to an identified or identifiable natural person (“data subject”); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that person.

105 s6(1) of the Privacy Act as [I]nformation or an opinion about an identified individual, or an individual who is reasonably identifiable: (a) whether the information or opinion is true or not; and (b) whether the information or opinion is recorded in a material form or not. See ‘Federal Register of Legislation – Privacy Act 1988’ (Legislation) 18 October 2023 <https://www.legislation.gov.au/C2004A03712/latest/text> accessed 1 February 2024.

106 Nadezhda Purtova, ‘The Law of Everything. Broad Concept of Personal Data and Future of EU Data Protection Law’ (2018) 10 L.I.T 40.

107 Mark Burdon and Paul Telford, ‘The Conceptual Basis of Personal Information in Australian Privacy Law’ (2010) 17 Murdoch Elaw Journal 1. For clarification of both approaches, see Sharon Booth and others, ‘What are ‘Personal Data’? A Study Conducted for the UK Information Commissioner’ (Final Report) (2004) <http://www.ico.gov.uk/upload/documents/library/corporate/research_and_reports/final_report_21_06_04.pdf> accessed 4 September 2023.

108 Also note the consent considerations as part of the Pia Grillo decision. See, for example, [40] and dissemination of imagery without consent.

109 Burdon and Telford (n 107).

110 Australian Law Reform Commission, For Your Information: Australian Privacy Law and Practice (ALRC Report No 108, August 2008) 309. Available at Australian Law Reform Commission, ‘For Your Information: Australian Privacy Law and Practice Act’ (Report) 12 August 2008 <https://www.alrc.gov.au/publication/for-your-information-australian-privacy-law-and-practice-alrc-report-108/> accessed 1 February 2024.

111 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L 119, Recital 51. Available at European Parliament and Council of the European Union, ‘Regulation – 2016/679 – EN – GDPR – EUR-Lex’ (Regulation) 4 May 2016 <https://eur-lex.europa.eu/eli/reg/2016/679/oj> accessed 1 February 2024.

112 Commissioner Initiated Investigation into Clearview AI, Inc. (Privacy) [2021] AICmr 54 (14 October 2021). (‘Clearview AI’). Available at Angelene Falk, ‘Commissioner initiated investigation into Clearview AI, Inc. (Privacy) [2021] AICmr 54 (14 October 2021)’ (Determination) 14 October 2021 <https://www.oaic.gov.au/__data/assets/pdf_file/0016/11284/Commissioner-initiated-investigation-into-Clearview-AI,-Inc.-Privacy-2021-AICmr-54-14-October-2021.pdf> accessed 1 February 2024.

113 Bonnie Devany, ‘Clearview AI's First Amendment: A Dangerous Reality?’ (2022) 101 Tex.L. Rev. 473.

114 Louise Matsakis, ‘Scraping the Web is a Powerful Tool. Clearview AI Abused It’ Wired (America, 25 January 2020) <https://www.wired.com/story/clearview-ai-scraping-web/> accessed 4 September 2023.

115 Clearview AI (n 112) [74].

116 Ibid [56].

117 Ibid [153].

118 Ibid [155].

119 Helen Nissenbaum, Privacy in Context: Technology, Policy, and the Integrity of Social Life (Stanford Law Books, 2010), 113.

120 See discussion above at III.A.

121 Licheng Jiao and others, ‘A Survey of Deep Learning-Based Object Detection’ (2019) 7 IEEE Access 128837.

122 Ibid.

123 Ibid.

124 Bharath Hariharan and others, ‘Simultaneous Detection and Segmentation’ in David Fleet and others (eds), Computer Vision – ECCV 2014 (Springer, 2014).

125 Muhammad Ahmed and others, ‘Survey and Performance Analysis of Deep Learning Based Object Detection in Challenging Environments’ (2021) 21 Sensors 5116.

126 Ross Girshick and others, ‘Rich Feature Hierarchies for Accurate Object Detection and Semantic Segmentation’ (2013) Arxiv <doi:10.48550/arxiv.1311.2524> accessed 4 September 2023.

127 Joseph Redmon and others, ‘You Only Look Once: Unified, Real-Time Object Detection’ (IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, 27 June 2016).

128 Wei Liu and others, ‘SSD: Single Shot Multibox Detector’ (Computer Vision–ECCV 2016: 14th European Conference, Amsterdam, The Netherlands, 11 October 2016).

129 Girshick and others (n 126).

130 Shaoqing Ren and others, ‘Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks’ (2017) 39 IEEE PAMI 1137.

131 Redmon and others (n 127).

132 Liu and others (n 128).

133 Xuan Wang and Zhigang Zhu, ‘Context Understanding in Computer Vision: A Survey’ (2023) 229 Computer Vision and Image Understanding 103646.

134 Ibid.

135 Navneet Dalal and Bill Triggs, ‘Histograms of Oriented Gradients for Human Detection’ (IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05), San Diego, 20 June 2005).

136 Zhengxia Zou and others, ‘Object Detection in 20 Years: A Survey’ (2023) 111 Proc. IEEE 257.

137 Nicolas Carion and others, ‘End-to-end Object Detection with Transformers’ (European Conference on Computer Vision, Switzerland, 2020).

138 Zou (n 136).

139 Ibid.

140 Santosh Divvala and others, ‘An Empirical Study of Context in Object Detection’ (IEEE Conference on Computer Vision and Pattern Recognition, Miami, 20 June 2009).

141 Mathias Lechner and others, ‘Neural Circuit Policies Enabling Auditable Autonomy’ (2020) 2 Nat. Mach. Intell. 642.

142 Divvala and others (n 140).

143 Yiping Gong and others, ‘Context-Aware Convolutional Neural Network for Object Detection in VHR Remote Sensing Imagery’ (2020) 58 IEEE Trans. Geosci. Remote Sens. 34.

144 Suyuan Liu and others, ‘HideSeeker: Uncover the Hidden Gems in Obfuscated Images’ (Proceedings of the 20th ACM Conference on Embedded Networked Sensor Systems, Boston, 6 November 2022).

145 Jimmy Tekli and others, ‘A Framework for Evaluating Image Obfuscation Under Deep Learning-assisted Privacy Attacks’ (2023) 82 Multimedia Tools and Applications 1 <doi:10.1007/s11042-023-14664-y>.

146 Ibid.

147 Ibid.

148 Ibid.

149 Richard McPherson and others, ‘Defeating Image Obfuscation with Deep Learning’ (1 September 2016) Arxiv <https://arxiv.org/abs/1609.00408> accessed 4 September 2023.

150 William Croft and others, ‘Obfuscation of Images via Differential Privacy: From Facial Images to General Images’ (2021) 14 Peer-to-Peer Netw. Appl. 1705.

151 Pierangela Samarati and Latanya Sweeney, ‘Protecting Privacy when Disclosing Information: k-Anonymity and its Enforcement through Generalization and Suppression’ (Technical Report) (March 1998) Semantic Scholar <https://www.semanticscholar.org/paper/Protecting-privacy-when-disclosing-information%3A-and-Samarati-Sweeney/7df12c498fecedac4ab6034d3a8032a6d1366ca6> accessed 4 September 2023.

152 Croft and others (n 150).

153 Yifan Wu and others, ‘Privacy-Protective-GAN for Privacy Preserving Face De-Identification’ (2019) 34 J. Comput. Sci. Technol. 47.

154 McPherson and others (n 149); Seong Joon Oh and others, ‘Faceless Person Recognition: Privacy Implications in Social Media’ (European Conference on Computer Vision, Netherlands, 11 October 2016).

155 Eman Hassan and others, ‘Cartooning for Enhanced Privacy in Lifelogging and Streaming Videos’ (IEEE Conference on Computer Vision and Pattern Recognition Workshops, Honolulu, 21 July 2017).

156 Karla Brkic and others, ‘I Know that Person: Generative Full Body and Face De-identification of People in Images’ (IEEE Conference on Computer Vision and Pattern Recognition Workshops, Honolulu, 21 July 2017).

157 Qianru Sun and others, ‘Natural and Effective Obfuscation by Head Inpainting’ (Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, 18 June 2018).

158 Jakob Geyer and others, ‘A2D2: Audi Autonomous Driving Dataset’ (14 April 2020) Arxiv <https://arxiv.org/abs/2004.06320> accessed 4 September 2023.

159 Tekli and others (n 145).

160 Ibid.

161 Peter Schaar, ‘Privacy by Design’ (2010) 3 Ident. Info. Soc. 267.

162 Ann Cavoukian, ‘Privacy by Design: The 7 Foundational Principles’ (August 2009) Ontario <https://www.ipc.on.ca/wp-content/uploads/resources/7foundationalprinciples.pdf> accessed 4 September 2023; Ann Cavoukian, Privacy by Design in Law, Policy and Practice (Canadian Electronic Library, 2011).

163 Alan Charles Raul and others, ‘Privacy by Design and Data Minimisation’ Global Data Review (8 April 2022) <https://globaldatareview.com/guide/the-guide-data-critical-asset/edition-1/article/privacy-design-and-data-minimisation> accessed 4 September 2023.

164 GDPR (n 111) art 25(1).

165 Lee Bygrave, ‘Data Protection by Design and Default: Deciphering the EU’s Legislative Requirements’ (2017) 4 Oslo L. Rev. 105, 117; Bert-Jaap Koops and Ronald Leenes, ‘Privacy Regulation Cannot be Hardcoded. A Critical Comment on the ‘Privacy by Design’ Provision in Data-protection Law’ (2014) 28 I.R.L.C.T. 159.

166 GDPR (n 111) Recital 78.

167 Bygrave (n 165) 119.

168 Ibid.

169 Seda Gurses and others, ‘Engineering Privacy by Design’ (2011) 14 Computers, Privacy & Data Protection 25 <https://software.imdea.org/~carmela.troncoso/papers/Gurses-CPDP11.pdf> accessed 4 September 2023.

170 Koops and Leenes (n 165).

171 Gurses and others (n 169).

172 Rubinstein and Good (n 18) 1358, and their focus on ‘privacy engineering’ and ‘on what companies can do to build privacy protections into their own systems.’

173 Brunton and Nissenbaum (n 1) 50 stating ‘ … we can better understand acts of obfuscation within a context of unavoidable relationships between people and institutions with large informational and power asymmetries.’

174 Ibid 1 ‘Obfuscation is the deliberate addition of ambiguous, confusing, or misleading information to interfere with surveillance and data collection.’

175 Brunton and Nissenbaum (n 1) 95.

176 Ibid 95.