Research Article

Migrating the state into corporate clouds

Received 15 Oct 2023, Accepted 21 Mar 2024, Published online: 24 Apr 2024

ABSTRACT

This article examines the cloudification of the state by focusing on ‘Project Nimbus’ – a $1.2 billion tender offered by the Israeli government to move ‘the entire Israeli state’ into corporate clouds. As one of the biggest informational infrastructure projects Israel has known, Nimbus revolves around the construction or repurposing of six massive, resource-hungry corporate data centers and the ‘migration’ of the state into them. Accordingly, Nimbus potentially redraws the relationship between the state and big tech, as well as between the state and its citizens. Based on thematic analysis of various project-related sources, we highlight four main aspects of the ‘cloudification of the state.’ First, focusing on cloud ideology, we expose the neo-liberal logics behind the construction of the Nimbus data centers. Second, focusing on cloud epistemology, we show that the effort and organizational changes that come with the cloudification of the state translate into an acute dependence on the tech giants. Third, focusing on the DCaaSization of the state, we argue that with its reliance on DCaaS (‘data center as a service’), the state's ‘behind the scenes’ is fading into the tech giants’ backstage. Lastly, we argue that this cloudification translates into disseminating advanced AI-powered data processing tools alongside pronounced data-ist Silicon Valley ideologies. We conclude by arguing that the construction of the Nimbus data centers and the subsequent cloudification of the state enhance power and information asymmetries between the state and big tech and discuss the data-coloniality of this new cloudified power structure.

In her now classic paper, Susan Leigh Star (Citation1999) called upon social scientists to study the ‘boring things’ in life, such as bridges, sewer grates, and other infrastructural components that recede into the background – yet make modern life possible. This has led to the so-called infrastructural turn in social research (Edwards et al., Citation2009; Hesmondhalgh, Citation2021; Plantin & Punathambekar, Citation2019), focusing on the invisible material and social practices that underpin communicative technologies. Nothing epitomizes this turn more than the central object of this article – data centers (Holt & Vonderau, Citation2015). These mundane structures are built to accommodate the computer servers, wires, and accompanying equipment needed to route traffic, analyze data, and serve content to internet companies and users. Situated in strategic locations, they promise to address connection speeds (lag), safety concerns (redundancy and security), and environmental requirements. Once erected, they are largely nondescript. From the outside, they are profoundly boring.

This article explores six such boring data centers recently built or repurposed in Israel. They are part of a $1.2 billion tender offered by the Israeli government to move its entire computational infrastructure ‘to the cloud,’ won by Amazon Web Services (AWS) and Google Cloud Platform (GCP) in 2020 [4,5,11]. As part of the winning bid, each company is required to build or repurpose three data centers set at least 25 kilometers apart [11]. With the new data centers, the Israeli government expects to move ‘the entire Israeli state’ – namely, the government, the public sector, the Israel Defense Force (IDF), and more – into these new ‘clouds’ and encourage big private businesses like banks, insurance companies, tech companies, and others to follow suit.Footnote1

Focusing on the ‘migration’ of governmental data into foreign corporate hands, we ask: How does the cloudification of the state influence its power balance with big tech in light of the former’s duty to its citizens? To answer, we explore ‘the cloudification of the state’ through a thematic analysis of dozens of official documents and training materials released by Israel and the tender winners themselves. After explaining how infrastructural objects like data centers became a key topic in critical media and data scholarship, we delineate how Project Nimbus redraws the relationship between the state and big tech through four central premises. First, focusing on cloud ideology, we highlight the neo-liberal logics behind the construction of the Nimbus data centers and the ensuing cloudification of the state. Second, focusing on cloud epistemology, we show that to migrate to the cloud, the state must laboriously restructure and standardize its various databases according to the tech giants’ standards and epistemic cultures (Knorr-Cetina, Citation1999). We argue that the effort and organizational changes that come with such restructuring translate into a ‘functional irreversibility’ (Lampland & Star, Citation2009, p. 15) and an acute dependence of the state on the tech giants. Third, focusing on the DCaaSization of the state, we argue that the cloud’s reliance on a DCaaS (‘data center as a service’) financial model is turning various governmental services into ‘managed services’ that are run by, and are the responsibility of, the commercial cloud providers, while the state's ‘behind the scenes’ is fading into the tech giants’ black-boxed backstage. Lastly, we argue that the cloudification of the state translates into the dissemination of advanced AI-powered data processing tools alongside pronounced data-ist Silicon Valley ideologies (van Dijck, Citation2014), once only available to particular securitized branches of government. This elevates AI as a powerful tool for the state but locks it behind corporate terms, rules, and affordances, in effect DCaaSizing AI. We conclude by arguing that the construction of the Nimbus data centers and subsequent cloudification enhance power and information asymmetries between the state and big tech and discuss the data-coloniality (Couldry & Mejias, Citation2019) of this new cloudified power structure.

Literature review

An infrastructural approach to nimbus

Often conceived as monumental and impregnable, infrastructures are the ‘stuff in the background’ that facilitates everyday actions in the world. Plantin et al. (Citation2018) trace the scholarly pursuit of infrastructural thinking from two distinct traditions: a historical perspective on large technical systems (LTS) and a sociological and phenomenological perspective on how infrastructures operate in the everyday. While not wholly separate, the two can be aligned historically. The first approach saw social scientists taking an interest in how complex systems such as the electric grid (Hughes, Citation1993) or transportation (Latour, Citation1996) come to be, how they are maintained, and what different actors participate in their making and unmaking. Broadly speaking, this approach was characterized by attempts to discern distributed governance. Since such infrastructures require multiple parties to work within a broad semi-bounded field, ‘fully developed infrastructures are complex ecologies whose components must continually adapt to each other’s ongoing change’ (Plantin et al., Citation2018, p. 296). It evokes a traditional vision of infrastructures – vast, visible, monolithic – even when attempting to unpack or demystify them.

The second approach emerges alongside the continuous propagation of infrastructures into the everyday with the advent of computing. While arguably always present in habitual use – one must simply remember McLuhan’s lionization of the electric grid as a medium of progress (Elder, Citation2021) – it was the computerization of work and subsequent digitalization of leisure that made digital infrastructure into an ever-present background concern, an enabler of multitudes of otherwise impossible actions. This newer approach focuses on being socialized into infrastructures and – in contrast to the heroic view of the first approach – paints them as something that is invisible to most and appears only when it breaks down or when one is trained to see it (Leigh Star, Citation1999; Bowker & Star, Citation2000). Historically, the rise of scholarly attention to the infrastructuring of the everyday aligns with what Agre (Citation1994) named ‘grammars of action’: changes occurring in processes and procedures so that they can be better captured within the incipient computerized ontology. When one’s organization becomes dependent on email instead of handwritten notes, specific actions that were previously possible are no longer so. This is particularly visible within a growing research interest in media infrastructures, highlighting the materiality of our seemingly ephemeral media practices and the way communication infrastructures are often nested on top of previous material and social practices (Mattern, Citation2015; Parks, Citation2015; Parks & Starosielski, Citation2015). Nevertheless, both approaches intersect and are in dialogue more than opposition. One can see the bridging and transition of the two approaches in the early suggested ‘agenda for infrastructure studies’ that warned against categorizing infrastructure according to types and approaches, insisting that ‘there really is just one field here’ (Edwards et al., Citation2009, p. 372).

This duality feeds well into another twin conceptual framework, which Plantin and others (Plantin et al., Citation2018; Poell et al., Citation2019; Zhang, Citation2021) have identified as a bifocal perspective on infrastructures and platforms. As we will show throughout our paper, Project Nimbus can act as both – depending on the analysis level and the perspective of the beholder. Accordingly, as expanded below, despite data centers’ popular perception, data storage is far from their main proposition, with data operations (or specialized software running remotely) being a growing key component of their business model (Narayan, Citation2022).

Infrastructural thinking helps us understand how the recently built data centers, Nimbus as a cloud platform, and the state itself become enmeshed. Ultimately, however, we must also be wary of the vagueness found in the infrastructural turn. David Hesmondhalgh (Citation2021) lists several common tropes in this scholarship that undermine its analytic usefulness. For instance, if ‘materiality’ can mean things like bureaucratic forms or learned practices within communities of knowledge – what separates them from immaterial things? In the spirit of this critique, we will outline our conceptualization of data centers vis-à-vis the cloud in the subsequent section. First, we examine the data in the data center and its relation to the national and international flows. Then, we expand this understanding onto the structure of the data center itself in its visibility, securitization, and power.

Data colonialism, infrastructures, and the state

Currently, a central premise in thinking about data is through the prism of its continuous extraction and exploitation. Data colonialism (Couldry & Mejias, Citation2019, Citation2023) is one approach that shares some common ground with other critical approaches to data, such as platform capitalism (Srnicek, Citation2016), surveillance capitalism (Zuboff, Citation2015), and critical data studies (Eubanks, Citation2017; Milan & Treré, Citation2021; Noble, Citation2018) in foregrounding the continuous extraction of value from human life through data. However, the data colonialism approach is distinctive in repositioning those critiques explicitly within colonialism's centuries-old relations to capitalism. Couldry and Mejias see the modern encroachment of data exploitation by Western (and, to some degree, Chinese) tech giants as a direct rather than metaphorical (Thatcher et al., Citation2016) continuation of colonial practices of landgrab and appropriation. If data is indeed the new oil – as the current truism goes, and contra Gitelman’s (Citation2013) aphorism – then it stands to reason that ‘data practices today would represent not just a continuation of colonialism/capitalism, but a distinctive new stage of colonialism that lays the foundations for new developments in capitalism, just as colonialism’s original landgrab enabled capitalism’s emergence and subsequent centuries of colonial oppression’ (Couldry & Mejias, Citation2023, p. 788).

Data’s materiality similarly highlights its colonial ties. In her work on tracing infrastructure as methodology, urban geographer Cowen (Citation2020) shows how, by following the archival traces of the Canadian Pacific Railway (CPR), one begins to untangle the making of Canadian national infrastructure through its financing by the Atlantic slave trade; the dispossession – and even genocide – of indigenous people; the Empire logic of military-mercantile elites overlooking the process; the racial segregation aimed at black and Chinese workers during its construction; and more. Infrastructure, Cowen shows, is inherently colonial, and its famed invisibility often supports the status quo and is only interrupted – becoming visible – by the acts of those on its margins. Building on this observation, Kathryn Furlong notes that data centers exhibit this colonial tendency as they ‘[are] combined with the increasing violence of ‘supply chain security’ enabled through this same surveillance and the logistics it supports’ (Furlong, Citation2021a, p. 193). In other words, the nation-state provides the corporate data center with the stability that, in turn, enables its contemporary dominance (Furlong, Citation2021b).

How do the material realities of Nimbus’ storage and operation play out at the state level? After all, if – as proclaimed by the project leaders – the aim is to move ‘the entire state into the cloud,’ it stands to reason that traditional instances of state power will be enhanced and extended by the infrastructural power of data centers. This raises questions about the (colonial) histories of data in relation to globalization (Appadurai, Citation1996; Citation2001; Plantin & Punathambekar, Citation2019), logistics (Rossiter, Citation2015), and the national specificity of data work (Avnoon et al., Citation2023). After all, real harms have been recorded in the imposition of (US) tech giants into local markets, particularly in the global south, where racial and linguistic biases have facilitated the propagation of harmful stereotypes and even governments’ profiling and incarceration (Chonka et al., Citation2023; Kalema, Citation2023). Similarly, it has become a hot-button topic for the EU, fearful for its data sovereignty and wary of US security surveillance post-Snowden (Ortiz Freuler, Citation2023; Pannier, Citation2021). The ill-fated Gaia-X project, for instance, shares some similarities with Nimbus in its attempt to create a pan-European standard for cloud services while grounding them in EU legal frameworks. While initially envisioning a complete buffer from US tech, it ran into multiple issues with technology requirements and deadlines, opening itself up to additional consortium members that included US (and Chinese) companies – essentially, to some critics, undermining itself in the process (Goujard & Cerulus, Citation2021). Another example is the French attempt to construct a national cloud infrastructure called Bleu, which is based on Google and Microsoft cloud tech but which, according to its designers, has ‘four safeguards [that] will guarantee the cloud’s immunity to US law: 1) 100% French or European capital; 2) day-to-day operations carried out by Bleu as a stand-alone company, without Microsoft’s intervention in management; 3) separate data centers, owned by Bleu; and 4) the control of ANSSI (French national cybersecurity agency)’ (Pannier, Citation2021, p. 4). Crucially, Bleu is a commercial initiative rather than a governmental one, and it operates out of existing (French) data centers rather than mandating the creation of new ones.

To understand Nimbus, we postulate that data centers are the infrastructure of infrastructure (Peters, Citation2015). Data centers store and operate on digital data. By doing so, they mould our relations with it in multiple ways, such as the temporal restructuring of access rhythms (Munn, Citation2022; Velkova & Plantin, Citation2023) or the commodification of previously-militarized locations for secure storage (Johnson, Citation2019; Taylor, Citation2021; Velkova, Citation2023). As more and more mediated relations, from apps to streaming services, depend on data flows, these flows need to be stored and managed. Data centers simultaneously represent the calculability of the world while determining what and who can be ‘rendered perceptible and calculable’ (Amoore, Citation2020, p. 33). The need for ever-growing quantities and speeds has led to the proliferation of new telecom competitors and Content Delivery Networks (CDNs) that act parallel to public global internet traffic (Winseck, Citation2019). This warrants the construction of new networks of global corporate data flows. And states, sensing both a threat to their power and a potential to expand it, are entering the fray.

Literature on state-corporate relationships teaches us to be careful when assuming a unidirectional relationship between the two. In particular, recent work on digital infrastructures is skeptical of the data colonialism framework as centered on the US tech giants. Winseck (Citation2019) shows that while internet giants maintain hegemony over the content distribution part of the tech stack, they are often just secondary players when it comes to investing in actual transmission infrastructure (data cables). Instead, they buy data capacity from secondary providers – often national or publicly-owned telecoms – while heavily investing in ‘data factories on either side of the Atlantic Ocean that allow them to warehouse the vast stores of data they collect and to bypass the undersea cables as much as possible altogether’ (p. 108). This includes the practice of edge caching, or predictively storing certain data closer to the intended source for local distribution (Helles & Flyverbom, Citation2019; Sandvig, Citation2015). In doing this, these giants attempt to distance themselves from their perceived role in US hard and soft power and signal a neutral adherence to local regulation (Ortiz Freuler, Citation2023). In this, the massive (hyperscale) data center is as much an attempt to curry favor with various local governments as it is a pure profit-driven enterprise.

Recent research on data centers has focused on their materiality (Brodie, Citation2020; Hogan, Citation2015; Levenda & Mahmoudi, Citation2019) and their environmental impact (Lannelongue et al., Citation2021; Pasek, Citation2019) – highlighting the climatic costs of such massive infrastructures and the amounts of water and electricity they require to function. Others have focused on the local politics surrounding the construction of such massive data infrastructures (Lehuedé, Citation2022; Rone, Citation2023), showing how citizens resist, contest, and challenge the construction of such resource-hungry infrastructures. This article, in turn, contributes to this line of research by highlighting the ideological and epistemic consequences of constructing and launching data centers vis-à-vis state sovereignty and of migrating an entire state into corporate clouds.

Methods

This article is part of a larger research project about Project Nimbus and the cloudification of the state. It is based on a vast corpus of public data that includes government documents about Project Nimbus (tenders, reports, announcements, summaries, and more); online content from various Nimbus-related websites, including government websites, AWS’s and GCP’s websites, Nimbus’s Cloud Center of Excellence (CCoE) website, and more. While most of Nimbus’s cloud training sessions are not public, we obtained a few GCP webinars and have transcribed them. We also collected tweets about Project Nimbus and the Israeli Cloud in Hebrew and English. Lastly, we amassed Israeli journalistic pieces about the project. The data were logged into MaxQDA22 and analyzed by the authors using thematic analysis (Braun & Clarke, Citation2006). We read and reread the data to identify recurring themes and coded the texts using these themes. Following the initial coding, we selected key documents, translated prominent quotes representative of each theme, and analyzed them using critical discourse analysis (Wodak & Meyer, Citation2009). Appendix 1 presents an overview of the 18 documents and training webinars analyzed in this paper, including ten from official governmental sources, three by GCP, two by AWS, and two from the consultancy group KPMG, advising the state in this transition. References to the items mentioned in the analysis consist of the items’ ordinal numbers in the appendix (1-18).

Notably, the construction of GCP’s and AWS’s Nimbus data centers has only recently ended, and their Israel Cloud Regions have only recently launched (GCP launched in October 2022, AWS in August 2023). Accordingly, the state’s ‘migration to the cloud’ has only begun. Nevertheless, the fact that Project Nimbus is currently a half-built assemblage (Burrell, Citation2020) only highlights the socio-political drama around it, exposing the oft-invisible ideational, political, and epistemological bolts and screws of such big-data infrastructure and offering a closer look at the different actors and stakeholders involved in creating and sustaining this global (data) regime.

Findings

Cloud ideology

Migrating a nation-state into ‘the cloud’ implies deep privatization of vast datafied aspects of the state – its data storage, computing power, data protection, data-driven applications, and more – and their transfer into the hands of Silicon Valley corporations. In doing so, the state willingly delegates responsibility over citizens’ data to private corporate hands, thus offering a renewed view of the ties between the state and big tech. In fact, the cloud providers explicitly discuss the new ‘responsibility model’ that comes with the cloud. As Google's representative explained in one of their webinars:

‘The core of cloud computing […] revolves around the shared responsibility model – shared responsibility and redrawing the boundaries around this responsibility. When it comes to an on-premises setup, whenever I develop an app, the entire realm of security rests on my shoulders: the servers, their physical whereabouts, their accessibility, and securing my application […] – all of these facets are solely under my responsibility because I have implemented them on my own machines, thus assuming complete responsibility. When I transition to the cloud, whether it's Google's or any other cloud, the underlying responsibilities, such as hardware, storage, communication, or log auditing, all transfer to the hands of the cloud vendor. What does this mean? It grants me a significant amount of time to focus on what truly matters: my application. As a customer [of the cloud vendor], […] my goal is to deliver and create value for my customers by [focusing on] the app itself. ’ [8]

The ‘shared responsibility model’ is often mentioned in GCP's and AWS's training sessions and documents and is considered a standard among cloud providers [see 8, 9]. Indeed, with the cloud, organizations replace their on-premises tangible – and often complex and expensive – informational infrastructures with cheaper virtual machines under the purview of much bigger companies. Nevertheless, while the shared responsibility model usually refers to the relationship between private entities (often startups) and the cloud giants, with Nimbus, what is shared is the state's responsibility towards its citizens’ data across the state's various ministries and offices.

Such wholesale delegation of the state's responsibilities toward its citizens echoes a starkly neoliberal logic – one that (in)famously seeks to minimize governments’ roles and hand them over to private, for-profit organizations (Braedley & Luxton, Citation2010; Ilcan, Citation2009). Accordingly, and in line with Israel's longstanding tendency toward neoliberal governance (Maron & Shalev, Citation2017), the Nimbus clouds are surrounded by market-oriented discourses. For example, the GCP representative in this excerpt described the state as a ‘customer’ that aims to ‘deliver and create value’ for its customers by better developing its ‘app.’ Another Google representative at GCP’s cloud webinar similarly said:

At the end of the day, we are talking about economic efficiency. […] We're not here just for the sake of it; we came here to be efficient. To generate value for our customers. Whether it's economic value or value that saves time or effort. We always consider the economic aspect when we design [cloud] architecture. [8]

Throughout their cloud webinars, GCP's representatives repeatedly treat the state as if it were a tech company that uses Google's cloud infrastructure to ‘bolster innovation’ and increase revenue rather than a sovereign, democratic state that seeks the good of its citizens. Interestingly, many of the state's documents surrounding Nimbus convey a similar logic. For example, a report by Israel's National Digital Agency explains how migrating to the cloud lets the state ‘focus on its core activities.’ They write:

Managing the cloud infrastructure frees the organization from the need to manage and maintain IT infrastructures, allowing information systems managers to focus on core operations – developing applications and improving the organization's business results.[10]

Echoing Google's narrative, as well as popular tropes among today’s neoliberal states (Maron & Shalev, Citation2017), this quote praises the cloud responsibility model, accordingly describing the state as a private for-profit organization aiming to maximize its ‘business results.’ Curiously, the ‘customers’ around which this ‘business’ operates (namely, the citizens) are rarely mentioned – here and throughout Nimbus's documents.

A document by the Government's ICT Authority echoes this logic. They write:

The authority will leverage the transition to the cloud as an opportunity to embed ‘agile’/‘lean’ methodologies as an integral part of the cloud environment, enabling not only cost reduction of cloud services but also facilitating organizational transformation. [10]

This excerpt treats the state as a tech company that seeks a more flexible and cost-effective (technological) development. By signaling that migrating to the cloud would make the state more ‘agile,’ these reports specifically embrace a highly popular trope among today's start-ups – agility. The agile methodology originated in software development (Mergel et al., Citation2021) but has since been adopted across various industries. Agile methodologies allegedly allow startups and other businesses to be ‘flexible,’ ‘responsive,’ and ‘efficient’ in their operations, enabling them to quickly iterate on their products or services based on market demands and user feedback. In this case, these values transfer from the realm of tech companies to that of the state, framing its cloudification as a positive move towards more efficient and profitable management.Footnote2

That is, the cloudification of the state comes with a strikingly neoliberal, privatizing, profit-seeking ideology, one that revolves around maximizing the state's ‘profits’ while minimizing its responsibility towards its citizens (as data subjects) and delegating it to big tech. This ideology similarly treats the state and its ministries as ‘organizations’ that should seek ‘agility’ and ‘flexibility’ rather than as governmental bodies that seek the well-being of their citizens. Nevertheless, beyond the starkly neoliberal ideology that surrounds it, Project Nimbus also has wide epistemic implications.

Cloud epistemology

With Nimbus, each state office is expected to independently restructure its internal database to fit AWS's or GCP's specific cloud architectures. Currently, the majority of Israel's data is stored ‘on-premises’ – in local and very tangible computational infrastructures (servers, hard drives, processors, networking equipment, etc.) located in dozens of ministries’ computer rooms and server farms. Some state data is already stored in corporate clouds, but with no uniform, constricting structure and across a variety of cloud providers. As the state's comptroller wrote in a 2021 report, the Israeli state's data is currently organized and stored through innumerable methods and systems, with various independent, explicit, or implicit data architectures [7].

Accordingly, one of Nimbus's key goals, often mentioned in state documents, is standardization – organizing the state’s data according to one (or two) unified data (infra)structures. The Nimbus tender promises that such standardization would lead to a more ‘efficient’ and cost-effective informational landscape, better communication between the state’s different branches, and more [3, 10, 14]. Nevertheless, standardization always comes at a price. After all, widespread standardization is an act of power (Busch, Citation2011), as information is organized into external, preexisting, and often universalizing systems. In this case, the localized, varied, often disorganized governmental data is being standardized according to the specific epistemic structures of the tech giants’ clouds in light of specific Silicon Valley-based epistemic cultures (Knorr-Cetina, Citation1999). For example, to migrate, the state’s data needs to be ‘rehosted,’ ‘re-platformed,’ or ‘refactored,’ and applications often need to be ‘re-architected’ to fit the cloud's structure [6, 12, 13, 14]. While the first two methods involve moving data to the cloud with relatively minimal modifications, re-architecting offers a more robust method, as it involves a significant overhaul of applications’ architecture and design to ‘fully leverage the cloud's capabilities’ [14], and it similarly involves rethinking and redesigning applications to ‘take advantage of cloud-native features’ [14].

Thus, migrating to the cloud entails succumbing to Silicon Valley’s epistemic power through standardization, and such standardization is overwhelmingly expensive and laborious (Lampland & Star, Citation2009). As a case in point, to migrate its data from its scattered local infrastructures to its cloud-based version, the state conducted two additional expensive tenders – one to select a consulting firm that would devise the strategy for the state's ‘cloud journey’ and the other to select dozens of local companies that would ‘assist [government offices] in the execution of development, modernization, and cloud migration processes’ [18]. This effort is also exemplified in the thorough organizational restructuring that comes with cloudification. To move to the cloud, every ministry is expected to appoint or train a long list of cloud practitioners: a Ministry Cloud Lead, Chief Cloud Architect, Cloud Security Engineer, and more. Interestingly, AWS and GCP require different knowledge and skills [14, 15, 16]. Both companies offer elaborate training programs for state employees that include webinars [8,11,12], dedicated online courses, cloud simulators, calculators, and an opportunity to test the clouds’ various features in dedicated mock-cloud environments. These various training tracks help current IT personnel retrain into a new, cloud-centered professional identity, with its specific competencies, technologies, and epistemologies. As the Government's ICT Authority wrote in one of its policy papers: ‘The computer room will no longer be located in the organization's premises, and the organization’s IT personnel will become the entities that define needs, manage, and supervise the [cloud] service provider [10].’

Thus, to migrate to the cloud, each state ministry independently and laboriously develops new knowledge, tools, skills, and organizational structures to align with the epistemic structure and logic of the tech giants’ clouds. This effort almost inevitably translates into an acute dependence of the state on the tech giants. After all, as Lampland and Star argued, following Callon (Citation1998), standardization often leads to a ‘functional irreversibility’ (Citation2009, p. 15) and to further social and material costs in the endless pursuit of ‘optimization’ (McKelvey & Neves, Citation2021). In the case of Nimbus, reversing the state's data back into an ‘on-premises’ status and dismantling the organizational structures and standards built around the corporate-specific clouds is expected to be enormously costly and is, hence, highly improbable. This mirrors the setbacks in dreaming of a European ‘sovereign cloud,’ due to the difficulties of either developing the local cloud sector from scratch or attempting to specify different standards so as not to fall under US legal purview (Ortiz Freuler, Citation2023; Pannier, Citation2021). As with Nimbus, the economic costs eventually drive the state to adapt to the tech giants rather than the other way around. This heightened dependence is characterized by the cloud’s specific economic structure, which we term DCaaSization – locking the customer into a Data Center as a Service model.

DCaaSizing the state

Nimbus requires that individual government offices independently sign up for the service under a pay-as-you-go model or an annual or tri-annual contract. With DCaaS, the state’s ministries and offices lease access to the servers, networking, storage, platform, software, or other computing resources owned by the cloud provider and are billed monthly, in US dollars, according to their actual use [10, 13]. As a Nimbus document explained: ‘The adoption of Cloud services is linked with a move to a consumption-based charging model where services are costed per user, transaction or utilized capacity’ [15]. Hence, like any other consumer, but with a 35-45% governmental discount [11], individual government offices are expected to lease cloud services, choosing the contract that would ‘fit them best.’ AWS and GCP offer simulators and calculators to help Nimbus users assess their projected monthly quota.
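To make this consumption-based logic concrete, the following minimal Python sketch illustrates how such a monthly, usage-based bill is composed. The unit prices, usage figures, and discount rate are hypothetical illustrations, not AWS's or GCP's actual Nimbus tariffs.

```python
# Illustrative sketch of a consumption-based ("pay-as-you-go") monthly cloud bill.
# All unit prices, usage figures, and the discount are hypothetical assumptions,
# not actual AWS/GCP Nimbus rates.

HYPOTHETICAL_RATES_USD = {
    "vm_hours": 0.05,          # per virtual-machine hour
    "storage_gb_month": 0.02,  # per GB stored per month
    "egress_gb": 0.08,         # per GB of outbound traffic
}

def monthly_bill(usage: dict, government_discount: float = 0.40) -> float:
    """Sum usage * unit price, then apply a (hypothetical) governmental discount."""
    gross = sum(HYPOTHETICAL_RATES_USD[item] * amount for item, amount in usage.items())
    return gross * (1 - government_discount)

# A ministry leasing 200 VMs around the clock, storing 50 TB, and serving 10 TB of traffic:
example_usage = {"vm_hours": 200 * 24 * 30, "storage_gb_month": 50_000, "egress_gb": 10_000}
print(f"Estimated monthly bill: ${monthly_bill(example_usage):,.2f}")
```

The point is the structure rather than the numbers: each ministry pays, in US dollars, for utilized capacity after a negotiated governmental discount, much as any private cloud customer would.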

DCaaS is further fragmented into different levels of ‘service.’ Nimbus's clouds (like most contemporary parallels) afford at least three ‘as a service’ models of operation: infrastructure as a service (IaaS), which affords leasing ‘virtual machines’ (VMs) – storage and computing power – from the cloud operator; platform as a service (PaaS) – leasing virtual machines with operating systems, runtime environments, web servers, or other middleware; and the more well-known software as a service (SaaS), which offers a wide range of specific software that is accessed and run through the web-based cloud interface. As is repeatedly explained in the Nimbus documents and training sessions [8, 10, 11, 13, 17], each level delegates more responsibility from the state to the cloud vendors. For example, with IaaS, the state hands off to Google the assembly, management, maintenance, and security of its computational infrastructure, but with PaaS and particularly SaaS, more and more datafied aspects of the state are delegated to the tech giants [13]. As a Google representative explained in a Nimbus training:

[Cloud migration] becomes more attractive and basically better as you progress from infrastructure as a Service (IaaS) to Platform as a Service (PaaS), and the most attractive aspect being Managed Service, or Software as a Service (SaaS), where the majority of responsibilities are thrown over to the cloud provider. Think about it – I have Gmail. That's SaaS, and I don't care what's happening behind the scenes. I use it, I have my UI, and that's it. All the backstage action is done by Google. Everything I can take to a place of managed service, I will strive to do so [8].

Hence, the DCaaSization of the state not only entails pronounced dependence on the tech giants and the transfer of state data to the giants’ hands but also pronounced efforts to turn various state services into ‘managed services’ – namely, services that are run by, and are the responsibility of, the cloud empires. In fact, according to Israel's ‘Cloud Strategy,’ state officials should always favor SaaS solutions over local ones [17], and these services should be provided by the Cloud Provider – AWS or GCP.Footnote3 This harkens back to Amoore’s (Citation2020) warning regarding the deep ontological reliance on cloud infrastructure and its categorization. Additionally, as Google’s representative explains, the state is not only relying on the tech giants to manage its computational infrastructure and store and compute its data; its ‘behind the scenes’ is also fading into the tech giants’ auspices, and state officials are expected to disregard what is happening there (‘All the backstage action is done by Google’).Footnote4 In the case of data-intensive applications, such ‘behind the scenes’ may include questions regarding the collection and use of users’ (citizens’) data, privacy concerns, biases in the data, the carbon footprint of running such applications, and more. Forgoing the state’s responsibilities toward its ‘datafied backstage’ inevitably assumes big tech would run it better. While we cannot be sure how those aspects will impact the relations between the state and citizens – due to the early stages of implementation and opaque data sharing practices – historical lessons are disconcerting. There is a shift, accelerated during the COVID-19 pandemic, towards the assumption of civil responsibilities like health or education by commercial entities (Lyon, Citation2023), predicated on massive dataveillance that is presented as more innocuous and apolitical when it is integrated into the very infrastructure of daily life (Gekker & Hind, Citation2020). In Israel, this shift has been particularly accelerated due to the public’s higher acceptance of the securitization discourse towards health and tracking (Ken-Dror Feldman et al., Citation2020). These questions become more pronounced when it comes to cloud-based AI tools.

DCaaSizing AI

In a GCP webinar dedicated to Big Data and ML in the cloud, one of the slides contained a dramatic picture of a space shuttle taking off below a headline that read: ‘If ML is a rocket engine, data is the fuel.’ Indeed, one of the key selling points of the cloud giants to encourage cloud migration is the ability to leverage the state's data using their cloud's advanced AI/ML tools. Accordingly, as we will demonstrate below, the cloudification of the state also translates into the dissemination and simplification of advanced AI-powered data processing tools, once only available to very specific branches of government. Such tools potentially open up new ways of analyzing, seeing, and interpreting citizens’ (and non-citizens’) data and, in effect, better surveilling, controlling, and socially sorting them (Lyon, Citation2003). For example, in GCP’s ‘Big Data and ML’ webinar, the company’s representative described today’s extensive data collection (‘I hope I'm not scaring anyone here!’). Mentioning Instagram and Netflix as prime examples, she said:

‘One of the big challenges with big data is extracting insights. [For example,] today, when we watch Netflix, we want to get good recommendations based on what we watch – [we want Netflix to] essentially extract insights and value from mountains of data. Think about it – how do we find logic within diverse and scattered piles of information? How do we analyze visual or textual data […] and actually derive logic from it?’ [9].

Conveniently ignoring her own company’s role in contemporary dataveillance (van Dijck, Citation2014), Google’s representative echoes the aforementioned neoliberal view of the state as a private tech company that aims to ‘extract insights and value’ from its user data. She similarly ignores the fact that, unlike Netflix or Instagram, the data in question is citizens’ data and that, accordingly, these data subjects cannot opt out of its collection, nor leave the service and ask to redact or expunge their data trail. Hence, through this webinar, GCP spells out the question government officials should allegedly be asking – how can they extract value from state data? – only to offer an allegedly taken-for-granted solution – the cloud’s advanced data processing tools.

In the spirit of DCaaS, the cloud vendors offer such tools at various levels of AI/ML functionality, and accordingly, each office can fine-tune the level of responsibility it delegates to the cloud vendors regarding AI/ML. For example, state officials (like any cloud consumer) can manually write, train, and optimize ML models while only using the cloud’s computational infrastructure (CPUs, GPUs, storage, and more); they can alternatively train AWS's or GCP's models with their own data with almost no coding (using ‘AutoML’); or they can use the companies’ pre-trained AI/ML models through dedicated APIs and analyze their data ‘right away,’ with no programming or pretraining [8, 9]. Such off-the-shelf cloud-based AI tools are varied, including face detection, voice recognition, object recognition, motion tracking, natural language processing (NLP), sentiment analysis, scene classification, and more. This option offers a relatively easy, ‘SaaSy’ version of AI/ML tools, with which users can upload troves of images, videos, or sound files and use the API to identify faces, locations, or sentiments on the spot [9]. According to the shared responsibility model, the accountability of using these models (with their potential biases (Noble, Citation2018) or other ramifications) is allegedly shared between the state and the cloud giants. Much of it supposedly remains ‘behind the scenes’ – black-boxed (Pasquale, Citation2015) beyond the government’s reach.
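To illustrate just how ‘SaaSy’ these off-the-shelf tools are, the following minimal Python sketch calls one of GCP's pre-trained models (face detection via the Cloud Vision API). It is an illustration only: it assumes a configured Google Cloud project with the google-cloud-vision client library installed, and the storage path is a hypothetical placeholder rather than anything drawn from the Nimbus materials.

```python
# Minimal sketch of calling a pre-trained, "SaaSy" cloud AI model (face detection).
# Assumes a configured Google Cloud project and the google-cloud-vision library;
# the storage path below is a hypothetical placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(source=vision.ImageSource(image_uri="gs://example-bucket/crowd.jpg"))

# A single API call returns detected faces, bounding boxes, and inferred "likelihoods"
# (joy, anger, etc.) with no model training or infrastructure on the caller's side.
response = client.face_detection(image=image)
for face in response.face_annotations:
    print(face.detection_confidence, face.joy_likelihood, face.anger_likelihood)
```

One API call, billed per use, is all that stands between a government office and such an analysis; the model itself, its training data, and its failure modes remain entirely on the provider's side of the ‘shared responsibility’ line.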

From the cloud webinars, it seemed that the cloud’s AI tools sparked the interest of government workers in the crowd. For example, when presenting the cloud's built-in NLP tools, GCP’s representatives uncharacteristically interrupted their presentation, saying there were many questions in the chat regarding the ability to use these tools to analyze other languages beyond English (most probably referring to Hebrew and Arabic) [9]. Similarly, after presenting a voice detection tool, a GCP representative read a question from the audience: ‘Can [people's] sentiment be identified using voice recognition? I mean, [determining] if a person is excited, lying, and so on?’ Yaron, Google's representative, answered:

‘It somewhat scares me to answer this question. No, [laughs]. You see, in principle, yes. The short answer is yes. The long answer – as you have noticed, there are various products here [offered in the cloud]. Often, these products are chained together. This means taking Speech to Text, converting it into text, and then understanding what is happening within the text. I don't have a system that can identify if someone is lying based on … I don't know … the tone of their voice. […] I don't even know how to define lies.’

A second Google representative interrupted:

I’d like to clarify the answer. Yaron presented here several ready-made APIs – for example, an API for video analysis. You don't need to do anything. Just upload the video […], and we already have a built-in ML API for identifying entities in the video, identifying anomalies in the video, identifying faces in the video, and so on. Assuming you have the relevant data, it is possible to use Google's infrastructure to train a model that identifies the probability of someone lying based on their voice. Again, to do that, we would need lots of data from people who are lying, lots of data from people who are not lying, and [a definition] of what exactly is lying. To achieve something like that on your own, you’d need extremely powerful computer farms. One of the advantages of the cloud is that you can access and utilize such large amounts of computing power easily, and they are readily available.

Yaron: […] As we said, we have a line of products, each of which can handle a specific use case. I have a pipeline that can take what I say and convert it into text. There is a product that can take text and do something with it, perform sentiment analysis. […] I have never given such a solution, but it may be possible to solve it there. [9].

While the question from the audience revolves around a questionable use of AI (creating an AI-powered lie detector) that raises heightened concerns when it comes to citizen data, GCP’s representatives quickly offered practical solutions to the problem based on the cloud’s various tools and affordances. Beyond mentioning the computational power afforded by the cloud, they proposed to chain together various GCP SaaS applications to try to solve this issue while sketching a technical route to train an ML algorithm to achieve this task. This explanation aims to demonstrate the simplicity of using cloud-based applications for analyzing vast data sets, but it is also offered as part of an all-encompassing techno-solutionist approach (Morozov, Citation2013) – one that sees every social problem as one that should and can be solved algorithmically using the cloud providers’ tools.
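The ‘chaining’ the representatives describe can be sketched in a few lines of Python. The snippet below is a hedged illustration rather than a description of any actual Nimbus deployment: it pipes GCP's Speech-to-Text output into the Natural Language API's pre-trained sentiment model, assumes a configured Google Cloud project with the google-cloud-speech and google-cloud-language libraries installed, and uses a hypothetical bucket path and audio settings.

```python
# Illustrative sketch of "chaining" managed cloud AI services, as described in the webinar:
# a transcript produced by Speech-to-Text is fed into the Natural Language sentiment API.
# Hypothetical example only: the audio URI is a placeholder, and which languages are
# supported at each link of the chain is itself determined by the cloud provider.
from google.cloud import language_v1, speech

# Step 1: transcribe an audio recording stored in a cloud bucket.
speech_client = speech.SpeechClient()
audio = speech.RecognitionAudio(uri="gs://example-bucket/interview.wav")
config = speech.RecognitionConfig(
    encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
    sample_rate_hertz=16000,
    language_code="en-US",
)
response = speech_client.recognize(config=config, audio=audio)
transcript = " ".join(result.alternatives[0].transcript for result in response.results)

# Step 2: pass the transcript to the pre-trained sentiment model.
language_client = language_v1.LanguageServiceClient()
document = language_v1.Document(
    content=transcript, type_=language_v1.Document.Type.PLAIN_TEXT
)
sentiment = language_client.analyze_sentiment(
    request={"document": document}
).document_sentiment
print(f"sentiment score: {sentiment.score:.2f}, magnitude: {sentiment.magnitude:.2f}")
```

Whatever its (dubious) value as a ‘lie detector,’ the sketch shows how little separates a government office from such a pipeline: a few managed API calls, billed per use, with the models themselves remaining behind the provider's curtain.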

Thus, the DCaaSization of the state potentially disseminates powerful AI tools to its various branches. While such tools are known to cause various algorithmic harms, we are currently not aware of any AI ethics courses included in Nimbus’s robust training sessions, nor of any ‘ethics owners’ (Metcalf et al., Citation2019) or other government employees who are specifically in charge of dealing with problematic AI uses. Moreover, the DCaaSization of the state not only disseminates commanding and potentially problematic dataveillance tools, but it also seems to come with some prominent Silicon Valley values, such as data-ism (van Dijck, Citation2014) and tech-solutionism (Morozov, Citation2013), which might find their way into the government’s corridors. Of course, some branches of the state have been using similar AI-powered tools for decades, specifically the state’s intelligence agencies. Nevertheless, the cloudification of the state potentially offers these tools to any government office, most of which could not and would not have ever gone down such paths.Footnote5

Conclusion

Project Nimbus is one of the biggest informational infrastructure projects Israel has known. Constructing six massive, resource-hungry corporate data centers at the heart of Israel and ‘migrating’ the entire state into GCP’s and AWS’s clouds is a colossal undertaking, one that can redraw the relationship between the state and big tech, as well as between the state and its citizens.Footnote6 As we have shown above, the cloudification of the state stems from a starkly neoliberal logic that, following the ‘cloud responsibility model,’ actively delegates state responsibilities towards its data to big tech. This move is similarly underpinned by market-oriented discourses that treat the state’s different branches as profit-seeking, agile, and flexible tech companies rather than governmental bodies responsible for their citizens’ welfare. We have further shown that with Nimbus, the state is obliged to restructure and standardize its various databases according to the tech giants’ standards. We argued that the effort and organizational changes such restructuring entails translate into acute dependence of the state on the tech giants in the form of an epistemic capture. This becomes particularly acute when different agencies potentially decide to step up their commitment to utilize ML and more advanced AI techniques, resulting in the corporate cloud’s unprecedented power over the inference of state-level decisions from civic data.

Highlighting the DCaaSization of the state, we argued that the reliance on a DCaaS (‘data center as a service’) model further underscores the state’s delegation of responsibility and heightened dependence on the tech giants by turning various governmental services into SaaS ‘managed services’ – namely, services that are run by, and are the responsibility of, the cloud empires – while the state's ‘behind the scenes’ is fading into the tech giants’ black-boxed backstage. We also showed that the cloudification of the state translates into the dissemination of advanced AI-powered data processing tools (alongside data-ist ideology (van Dijck, Citation2014)), once only available to particular branches of government.

A central premise of data centers is their supposed security. Often found in remote and well-defended locations, such as ex-nuclear bunkers or natural caves, data centers often focus on storage and computation on demand, with the responsibility over security and latency passing from the client to the provider. This ties into two key tenets of infrastructural arrangement: spatial re-distribution with simultaneous re-centralization and enclosure of control (Gekker & Hind, Citation2020; Plantin, Citation2018). Security considerations rank high both in the government's advertising of Nimbus and in the cloud providers’ advertising towards potential local users. However, those are part and parcel of the bind we describe throughout the article. While competing globally on pricing and offerings (Narayan, Citation2022), global data center providers share glossy figures and invest in hip advertising while disclosing very little of their server configuration, development plans, or capabilities. In this, the rise of remote computing has led to abstraction: ‘transparency as immaterialized in “the cloud” has turned into an all-purpose political metaphor for the fact that we are storing our data (or our company’s data) on someone else’s servers in an undisclosed location that we will never be able to see’ (Holt & Vonderau, Citation2015, p. 75). Nimbus might (somewhat) disclose these locations, but the same security reasons might also significantly limit the client’s ability to intervene in them. Israeli civic data – and its AI derivatives – is now maintained by foreign corporations. Recalling the inherent coloniality of data infrastructure (Cowen, Citation2020), we should ask: How does data colonialism (Couldry & Mejias, Citation2019) manifest in the corporate cloudification of the state?

Like other colonial infrastructures, the construction of AWS’s and GCP’s data centers, with the arduous migration, stringent standardization, and organizational and epistemic restructuring, translates into the state’s acute dependence on foreign cloud empires. Colonial infrastructures, and specifically knowledge infrastructures (like classification systems, social sorting mechanisms, and more (Bowker & Star, Citation2000)), have played a vital role in the creation and sustainment of colonial empires, and many of these infrastructures (epistemic or otherwise) remain as institutional colonial legacies (Berda, Citation2013) to this day. Nevertheless, this cloud’s data-colonialist power might still seem innocuous – after all, it is merely making Israel’s knowledge infrastructures faster, more reliable, more connected, and more ‘modern.’

However, the flow of Silicon Valley’s values like data-ism (van Dijck, Citation2014) and technological solutionism (Morozov, Citation2013) into government corridors, and the fact that advanced dataveillance (van Dijck, Citation2014) tools (like face recognition or sentiment analysis) are offered to every government office as a ‘managed service,’ highlight the potential ramifications of these powers. Moreover, the fact that AWS’s, GCP’s, and other third-party companies’ representatives take part in devising the government offices’ cloud architectures almost inevitably puts these companies’ employees in strategic positions, as they potentially take part in crucial governmental decision-making. Nimbus’s functional irreversibility (Lampland & Star, Citation2009) and acute dependence are more than a financial consideration, as cloud providers can potentially alter their deals with clients (for example, Google recently announced the surprise sell-off of its domain management system to Squarespace (Quach, Citation2023)) or withdraw from the country entirely (as in the suspension of Google’s operations in Russia following its attack on Ukraine). This is not an unlikely scenario for a country involved in multiple active ethnonational conflicts, currently threatened by a dramatic regime change and a recent war.Footnote7 As mentioned above, Israel’s state offices pay for cloud services on a monthly basis in US$. Dramatic devaluations in the Israeli currency (which often come with geopolitical shocks) might make this deal much less feasible for the state. Hence, even if cloudification is not centered around data extraction, it includes the exertion of epistemic, data-colonialist power in various ways.

Nevertheless, this power structure is anything but unidimensional. While Amazon and Google clearly stand to gain from adding the Nimbus data centers to their global networks, Israel is also poised to benefit from its corporate-based cloudification, even if it may lose some of its sovereignty, autonomy, and responsibility toward its citizens’ data, in contrast to EU attempts at reversing these trends (Pannier, Citation2021). First, aligning with the neoliberal logic mentioned above, the state views its cloudification as a means to significantly reduce costs related to data storage, data security, and data management compared to traditional on-premises systems. This privatized cloud solution is perceived as an avenue to save the state millions while providing more robust, secure, and updated solutions. More broadly, as we have seen, Nimbus is regarded as an opportunity to enhance the efficiency of the state’s bureaucratic machinery, which, like many governmental bureaucracies, is often seen as slow, outdated, and inefficient. In contrast, tech giants like Google and Amazon are seen as the epitome of modernity, efficiency, and innovation, promising to impart some of these qualities to their governmental partners. Symbolically, Nimbus's cloud migration also presents an opportunity for the state to align more closely with Israel's nearly mythical tech sector, known for its achievements, grandeur, and relentless innovation (Senor & Singer, Citation2009). Moreover, Nimbus allegedly seeks to improve the state's connectivity with this sector, enabling startups to more easily collaborate with government agencies, access their systems, utilize their data, offer demos, and more.

Therefore, while some activists mistakenly portray Project Nimbus as one that primarily enhances Israel's surveillance capabilities or technologically reinforces the occupationFootnote8, it appears the project's goals are much more mundane. The IDF and other Israeli security forces might utilize Nimbus and benefit from its offerings (for instance, startups might find it easier to collaborate with the IDF), but these entities, being particularly affluent and influential parts of the government, seem to be the least in need of AWS's or GCP's GPUs or other cloud services. In other words, the Nimbus Project and the attempt to cloudify an entire state might seem like a starkly Israeli project, but not due to Israeli politics, its war-ridden reality, or the occupation of the Palestinian territories. Instead, these clouds seem to have settled in Israel due to its small size (22,145 sq.km with a population of 9 million), its strategic geographic location (between Europe and Asia, and in the middle of the tech giants’ cloud empires), and its loose and potentially haphazard bureaucratic and political culture that made such a project possible. If the data colonialism metaphor is to be taken fully, it seems that, just like before, different social strata respond differently to colonization, with business and technological elites often benefiting from the change.

Lastly, the DCaaSization of the state also underscores the dual status of data centers as both infrastructures and platforms. As the GCP representative mentioned in one of the webinars, clients move from the relatively tame IaaS through PaaS and all the way to SaaS – from infrastructure to software and back. This further entrenches the epistemic status of Nimbus within the national tech-sphere, with its data centers being both originators and users of the cloud. Depending on different arrangements and tiers, cloud providers can act as custodians only or take an active part in clients’ daily operations by allowing a secondary marketplace of specialized software providers. ‘Analytically, it is important to bridge the growing gap between the analysis of data centers and their materiality with that of software development. This configuration of centralized and aggregated hardware directly transforms the realm of software production’ (Narayan, Citation2023, p. 292). This can also lead to governments adopting one type of cloud over another, particularly as a factor of trust in the provider’s reliability (Liang et al., Citation2017). We must, therefore, continuously perform the complicated conceptual bifocalism of infrastructures and platforms when examining further DCaaSization of the state (and other states): between the local and global, extractive and permitting, locked-in and expansive, stand those boring, mundane data centers.


Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Notes on contributors

Dan M. Kotliar

Dan Kotliar is a lecturer (Assistant Professor) at the Department of Sociology, University of Haifa. His research explores the ties between algorithms and culture from a qualitative, critical perspective. His recent projects include algorithmic production in Israel and Silicon Valley, Israeli surveillance firms, data infrastructures, and the ties between AI, science, and startup culture. Kotliar has published in leading journals such as New Media & Society; Information, Communication & Society; Theory & Society; Science, Technology & Human Values; and others [email: [email protected]].

Alex Gekker

Alex Gekker is an Assistant Professor in Digital Research Methods at the Media Studies department, University of Amsterdam. His research incorporates various aspects of digital media, primarily focusing on platforms and interfaces to analyze maps, surveillance assemblages, autonomous cars, videogame ecosystems, and more. He has published in New Media & Society, Social Media + Society, American Behavioral Scientist, Surveillance & Society, and Geoforum. He has co-edited two Open Access books on mapping: one on temporality and the other on play [email: [email protected]].

Notes

1 The tender requires both companies to commit to launching a local ‘cloud region’ (which aims to increase Israel’s information security and make Israel less vulnerable to global Internet outages) and to guaranteeing service to all branches of the government regardless of their workers’ objections or protests. The cloud regions are expected to provide considerable economic benefits through an investment of at least $4 billion by the tech giants and the creation of approximately 3,000 jobs (Taitelbaum, Citation2021).

2 These discourses similarly echo contemporary neoliberal trends of the ‘agile government’ (Gillies, Citation2011).

3 Nimbus’s fifth and final tender includes the creation of an online marketplace where other companies would offer their SaaS products over the Nimbus clouds. Currently, these clouds only include SaaS solutions offered by the respective cloud provider – AWS or GCP.

4 Like other large institutions, the Israeli government has been utilizing software from tech giants for years, particularly their SaaS products. A notable example is Microsoft’s Office 365 suite, which is marketed as a corporate SaaS package and is widely used in Israeli government offices. However, with such SaaS solutions, the relationship between the state and big tech typically ends at the government employee’s desk (and possibly with the extraction of some of their data). Nimbus, on the other hand, operates on a much broader scale, as it involves citizen data along with various other public, statewide types of data. Thus, the magnitude of Project Nimbus extends beyond the physical presence of its data centers, as it also encompasses an unprecedented flow of data from state offices to corporate clouds.

5 It should be noted that in Israel, public discussion of Nimbus is notably sparse. Contrary to the situation in countries like the Netherlands (Rone, Citation2023), where citizen groups actively oppose similar projects on environmental, political, or other grounds, such opposition is largely absent in Israel. Moreover, although the press has been reporting on Nimbus, it generally does so without significant criticism. Accordingly, the project is not associated with any specific political party or leader, and the neoliberal logic underpinning it is widely accepted across the Israeli political spectrum. Consequently, Nimbus does not spark significant political opposition, resistance from local activists, or scrutiny from journalists, other than occasional concerns regarding the security implications of the data centers’ locations.

6 While Project Nimbus explicitly aims to migrate the ‘entire state’ into corporate clouds, the reality is more nuanced. Since each government office joins Nimbus independently, on its own terms and according to its own schedule (some have requested to postpone their move), and since private businesses are not obliged to join, the true scope and the real limits of this transfer remain to be seen. Moreover, some data might be deemed too confidential to upload to a cloud, while other data might be stored too deep inside a dusty, long-forgotten governmental basement.

7 Nimbus’s (confidential) contracts reportedly contain a clause that forbids AWS or GCP from breaking the contract for political reasons (Taitelbaum, Citation2021). Although corporations of this magnitude can break contracts and pay the resulting damages when the right (geo)political justification arises, so far – five months into the Israel-Hamas war, and with mounting international criticism of Israel – the relationship between Israel and the Nimbus tech giants seems intact.

References

  • Agre, P. E. (1994). Surveillance and capture: Two models of privacy. The Information Society, 10(2), 101–127. https://doi.org/10.1080/01972243.1994.9960162
  • Amoore, L. (2020). Cloud ethics: Algorithms and the attributes of ourselves and others. Duke University Press.
  • Appadurai, A. (1996). Modernity at large: Cultural dimensions of globalization. University of Minnesota Press.
  • Appadurai, A. (2001). Globalization. Duke University Press Books.
  • Avnoon, N., Kotliar, D. M., & Rivnai-Bahir, S. (2023). Contextualizing the ethics of algorithms: A socio-professional approach. New Media & Society, https://doi.org/10.1177/14614448221145728
  • Berda, Y. (2013). Managing dangerous populations: Colonial legacies of security and surveillance. Sociological Forum, 28(3), 627–630. https://doi.org/10.1111/socf.12042
  • Bowker, G. C., & Star, S. L. (2000). Sorting things out: Classification and its consequences (Rev. ed.). The MIT Press.
  • Braedley, S., & Luxton, M. (2010). Neoliberalism and everyday life. McGill-Queen’s Press.
  • Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa
  • Brodie, P. (2020). Climate extraction and supply chains of data. Media, Culture & Society, 42(7-8), 1095–1114. https://doi.org/10.1177/0163443720904601
  • Burrell, J. (2020). On half-built assemblages: Waiting for a data center in Prineville, Oregon. Engaging Science, Technology, and Society, 6, 283–305. https://doi.org/10.17351/ests2020.447
  • Busch, L. (2011). Standards: Recipes for reality. Infrastructures series. MIT Press.
  • Callon, M. (1998). Introduction: The embeddedness of economic markets in economics. The Sociological Review, 46(Suppl. 1), 1–57. https://doi.org/10.1111/j.1467-954X.1998.tb03468.x
  • Chonka, P., Diepeveen, S., & Haile, Y. (2023). Algorithmic power and African indigenous languages: Search engine autocomplete and the global multilingual Internet. Media, Culture & Society, 45(2), 246–265. https://doi.org/10.1177/01634437221104705
  • Couldry, N., & Mejias, U. A. (2019). Data colonialism: Rethinking big data’s relation to the contemporary subject. Television & New Media, 20(4), 336–349. https://doi.org/10.1177/1527476418796632
  • Couldry, N., & Mejias, U. A. (2023). The decolonial turn in data and technology research: What is at stake and where is it heading? Information, Communication & Society, 26(4), 786–802. https://doi.org/10.1080/1369118X.2021.1986102
  • Cowen, D. (2020). Following the infrastructures of empire: Notes on cities, settler colonialism, and method. Urban Geography, 41(4), 469–486. https://doi.org/10.1080/02723638.2019.1677990
  • Edwards, P., Bowker, G., Jackson, S., & Williams, R. (2009). Introduction: An agenda for infrastructure studies. Journal of the Association for Information Systems, 10(5), 364–374. https://doi.org/10.17705/1jais
  • Elder, R. B. (2021). Communication, place, nation, media: Electricity reunites the sundered world of industry. American Review of Canadian Studies, 51(3), 458–474. https://doi.org/10.1080/02722011.2021.1947653
  • Eubanks, V. (2017). Automating inequality: How high-tech tools profile, police, and punish the poor. St. Martin’s Press.
  • Furlong, K. (2021a). Geographies of infrastructure II: Concrete, cloud and layered (in)visibilities. Progress in Human Geography, 45(1), 190–198. https://doi.org/10.1177/0309132520923098
  • Furlong, K. (2021b). Geographies of infrastructure III: Infrastructure with Chinese characteristics. Progress in Human Geography, 46(3), 915–925. https://doi.org/10.1177/03091325211033652
  • Gekker, A., & Hind, S. (2020). Infrastructural surveillance. New Media & Society, 22(8), 1414–1436. https://doi.org/10.1177/1461444819879426
  • Gillies, D. (2011). Agile bodies: A new imperative in neoliberal governance. Journal of Education Policy, 26(2), 207–223. https://doi.org/10.1080/02680939.2010.508177
  • Gitelman, L. (Ed.). (2013). “Raw data” is an oxymoron. MIT Press.
  • Goujard, C., & Cerulus, L. (2021, October 26). Inside Gaia-X: How chaos and infighting are killing Europe’s grand cloud project. POLITICO. https://www.politico.eu/article/chaos-and-infighting-are-killing-europes-grand-cloud-project/
  • Helles, R., & Flyverbom, M. (2019). Meshes of surveillance, prediction, and infrastructure: On the cultural and commercial consequences of digital platforms. Surveillance & Society, 17(1/2), 34–39. https://doi.org/10.24908/ss.v17i1/2.13120
  • Hesmondhalgh, D. (2021). The infrastructural turn in media and internet research. In P. McDonald (Ed.), The Routledge companion to media industries (pp. 132–142). Routledge. https://eprints.whiterose.ac.uk/168831/
  • Hogan, M. (2015). Data flows and water woes: The Utah data center. Big Data & Society, 2(2), 205395171559242. https://doi.org/10.1177/2053951715592429
  • Holt, J., & Vonderau, P. (2015). “Where the internet lives”: Data centers as cloud infrastructure. In L. Parks, & N. Starosielski (Eds.), Signal traffic (pp. 71–93). University of Illinois Press.
  • Hughes, T. P. (1993). Networks of power: Electrification in Western society, 1880–1930. Johns Hopkins University Press.
  • Ilcan, S. (2009). Privatizing responsibility: Public sector reform under neoliberal government. Canadian Review of Sociology/Revue Canadienne de Sociologie, 46(3), 207–234. https://doi.org/10.1111/j.1755-618X.2009.01212.x
  • Johnson, A. (2019). Data centers as infrastructural in-betweens. American Ethnologist, 46(1), https://doi.org/10.1111/amet.12735
  • Kalema, N. (2023). Deconstructing the global coded gaze on digital transformation. Anti-Racism Policy Journal, 2(2), Article 2.
  • Ken-Dror Feldman, D., Purian, R., Ben David, A., & Kadan, N. (2020). Invisible surveillance, indifferent publics: Israeli perceptions of voluntary contact tracing applications vs. mandatory general secret service surveillance during the COVID-19 pandemic. In Rethinking privacy and mass surveillance in the information age: Paper series. Israel Public Policy Institute and Heinrich Böll Foundation.
  • Knorr-Cetina, K. (1999). Epistemic cultures: How the sciences make knowledge. Harvard University Press.
  • Lampland, M., & Star, S. L. (2009). Reckoning with standards. In M. Lampland & S. L. Star (Eds.), Standards and their stories: How quantifying, classifying, and formalizing practices shape everyday life. Cornell University Press.
  • Lannelongue, L., Grealey, J., & Inouye, M. (2021). Green algorithms: Quantifying the carbon footprint of computation. Advanced Science, 8(12), https://doi.org/10.1002/advs.202100707
  • Latour, B. (1996). Aramis, or the love of technology. Harvard University Press.
  • Lehuedé, S. (2022). Territories of data: Ontological divergences in the growth of data infrastructure. Tapuya: Latin American Science, Technology and Society, 5, https://doi.org/10.1080/25729861.2022.2035936
  • Leigh Star, S. (1999). The ethnography of infrastructure. American Behavioral Scientist, 43(3), 377–391. https://doi.org/10.1177/00027649921955326
  • Levenda, A., & Mahmoudi, D. (2019). Silicon forest and server farms: The (urban) nature of digital capitalism in the Pacific Northwest. Culture Machine, 1–15. https://culturemachine.net/wp-content/uploads/2019/04/LEVENDA-MAHMOUDI.pdf
  • Liang, Y., Qi, G., Wei, K., & Chen, J. (2017). Exploring the determinant and influence mechanism of E-government cloud adoption in government agencies in China. Government Information Quarterly, 34(3), 481–495. https://doi.org/10.1016/j.giq.2017.06.002
  • Lyon, D. (2003). Surveillance as social sorting: Computer codes and mobile bodies. Routledge.
  • Lyon, D. (2023). Surveillance and the power of platforms. Cambridge Journal of Regions, Economy and Society, 16(2), 361–365. https://doi.org/10.1093/cjres/rsad006
  • Maron, A., & Shalev, M. (2017). Neoliberalism as a state project: Changing the political economy of Israel. Oxford University Press.
  • Mattern, S. (2015). Deep time of media infrastructure. In L. Parks, & N. Starosielski (Eds.), Signal traffic (pp. 94–112). University of Illinois Press.
  • McKelvey, F., & Neves, J. (2021). Introduction: Optimization and its discontents. Review of Communication, 21(2), 95–112. https://doi.org/10.1080/15358593.2021.1936143
  • Mergel, I., Ganapati, S., & Whitford, A. B. (2021). Agile: A new way of governing. Public Administration Review, 81(1), 161–165. https://doi.org/10.1111/puar.13202
  • Metcalf, J., Moss, E., & boyd, d. (2019). Owning ethics: Corporate logics, Silicon Valley, and the institutionalization of ethics. Social Research: An International Quarterly, 86(2), https://doi.org/10.1353/sor.2019.0022
  • Milan, S., & Treré, E. (2021). Big data from the south(s): An analytical matrix to investigate data at the margins. In D. Rohlinger, & S. Sobieraj (Eds.), The Oxford handbook of sociology and digital media (pp. 1–21). Oxford University Press.
  • Morozov, E. (2013). To save everything, click here: The folly of technological solutionism. Public Affairs.
  • Munn, L. (2022). Thinking through silicon: Cables and servers as epistemic infrastructures. New Media & Society, 24(6), 1399–1416. https://doi.org/10.1177/1461444820977197
  • Narayan, D. (2022). Platform capitalism and cloud infrastructure: Theorizing a hyper-scalable computing regime. Environment and Planning A: Economy and Space, 54(5), 911–929. https://doi.org/10.1177/0308518X221094028
  • Narayan, D. (2023). Monopolization and competition under platform capitalism: Analyzing transformations in the computing industry. New Media & Society, 25(2), 287–306. https://doi.org/10.1177/14614448221149939
  • Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. NYU Press.
  • Ortiz Freuler, J. (2023). The weaponization of private corporate infrastructure: Internet fragmentation and coercive diplomacy in the 21st century. Global Media and China, 8(1), 6–23. https://doi.org/10.1177/20594364221139729
  • Pannier, A. (2021, July 22). The changing landscape of European cloud computing: Gaia-X, the French national strategy, and EU plans. https://www.ifri.org/en/publications/briefings-de-lifri/changing-landscape-european-cloud-computing-gaia-x-french-national
  • Parks, L. (2015). ‘Stuff you can kick’: Toward a theory of media infrastructures. In P. Svensson & D. T. Goldberg (Eds.), Between humanities and the digital (pp. 355–373). The MIT Press.
  • Parks, L., & Starosielski, N. (2015). Introduction. In L. Parks, & N. Starosielski (Eds.), Signal traffic (pp. 1–28). University of Illinois Press.
  • Pasek, A. (2019). Managing carbon and data flows: Fungible forms of mediation in the cloud. Culture Machine, 1–15.
  • Pasquale, F. (2015). The black box society. Harvard University Press.
  • Peters, J. D. (2015). The marvelous clouds: Toward a philosophy of elemental media. The University of Chicago Press.
  • Plantin, J.-C. (2018). Digital traces in context | Google Maps as cartographic infrastructure: From participatory mapmaking to database maintenance. International Journal of Communication, 12, 18.
  • Plantin, J.-C., Lagoze, C., Edwards, P. N., & Sandvig, C. (2018). Infrastructure studies meet platform studies in the age of Google and Facebook. New Media & Society, 20(1), 293–310. https://doi.org/10.1177/1461444816661553
  • Plantin, J.-C., & Punathambekar, A. (2019). Digital media infrastructures: Pipes, platforms, and politics. Media, Culture & Society, 41(2), 163–174. https://doi.org/10.1177/0163443718818376
  • Poell, T., Nieborg, D., & van Dijck, J. (2019). Concepts of the digital society: Platformisation. Internet Policy Review, 8(4), https://doi.org/10.14763/2019.4.1425
  • Quach, K. (2023). Google Domains to shut down, bought up by Squarespace. The Register. https://www.theregister.com/2023/06/18/google_domains_shutting_down
  • Rone, J. (2023). The shape of the cloud: Contesting data centre construction in North Holland. New Media & Society, https://doi.org/10.1177/14614448221145928
  • Rossiter, N. (2015). Coded vanilla: Logistical media and the determination of action. South Atlantic Quarterly, 114(1), 135–152. https://doi.org/10.1215/00382876-2831334
  • Sandvig, C. (2015). The internet as the anti-television. In L. Parks, & N. Starosielski (Eds.), Signal traffic (pp. 225–245). University of Illinois Press.
  • Senor, D., & Singer, S. (2009). Start-up nation: The story of Israel’s economic miracle. Grand Central Publishing.
  • Srnicek, N. (2016). Platform capitalism. Polity.
  • Taitelbaum, S. (2021, May 24). נחתמו החוזים: פרויקט נימבוס הממשלתי יעסיק כ−3,000 עובדים [The contracts have been signed: The government’s Nimbus project will employ approximately 3,000 workers]. Calcalist. https://www.calcalist.co.il/calcalistech/article/SJKCfMYYd
  • Taylor, A. R. E. (2021). Future-proof: Bunkered data centres and the selling of ultra-secure cloud storage. Journal of the Royal Anthropological Institute, 27(S1), 76–94. https://doi.org/10.1111/1467-9655.13481
  • Thatcher, J., O’Sullivan, D., & Mahmoudi, D. (2016). Data colonialism through accumulation by dispossession: New metaphors for daily data. Environment and Planning D: Society and Space, 34(6), 990–1006. https://doi.org/10.1177/0263775816633195
  • van Dijck, J. (2014). Datafication, dataism and dataveillance: Big data between scientific paradigm and ideology. Surveillance & Society, 12(2), 197–208. https://doi.org/10.24908/ss.v12i2.4776
  • Velkova, J. (2023). Retrofitting and ruining: Bunkered data centers in and out of time. New Media & Society, 25(2), 431–448. https://doi.org/10.1177/14614448221149946
  • Velkova, J., & Plantin, J.-C. (2023). Data centers and the infrastructural temporalities of digital media: An introduction. New Media & Society, 25(2), 273–286. https://doi.org/10.1177/14614448221149945
  • Winseck, D. (2019). Internet infrastructure and the persistent myth of U.S. hegemony. In B. Haggart, K. Henne, & N. Tusikov (Eds.), Information, technology and control in a changing world: Understanding power structures in the 21st century (International Political Economy Series, pp. 93–120). Springer International Publishing.
  • Wodak, R., & Meyer, M. (2009). Critical discourse analysis: History, agenda, theory and methodology. In R. Wodak & M. Meyer (Eds.), Methods of critical discourse studies (pp. 1–33). Sage.
  • Zhang, Z. (2021). Infrastructuralization of Tik Tok: Transformation, power relationships, and platformization of video entertainment in China. Media, Culture & Society, 43(2), 219–236. https://doi.org/10.1177/0163443720939452
  • Zuboff, S. (2015). Big other: Surveillance capitalism and the prospects of an information civilization. Journal of Information Technology, 30(1), 75–89. https://doi.org/10.1057/jit.2015.5