Strategic Stability in the 21st Century

False Sense of Supremacy: Emerging Technologies, the War in Ukraine, and the Risk of Nuclear Escalation

Pages 28-46 | Received 04 Oct 2022, Accepted 25 May 2023, Published online: 28 Jun 2023

ABSTRACT

How will emerging technologies impact crisis escalation? What has been the escalatory – or de-escalatory – effect of emerging technologies in contemporary crises? And can the use of emerging technologies increase risks of nuclear use? To answer these questions, we use the ongoing war in Ukraine as a case study to identify how emerging technologies are being used in modern conflicts and the associated risks of escalation, potentially to include nuclear use. We argue that emerging technologies gave Russia a false sense of supremacy in the lead-up to the war in Ukraine and have largely failed to deliver Russia battlefield victories. As a result, Moscow has increased reliance on nuclear weapons and nuclear threats. This reliance could be exacerbated in the aftermath of the war in Ukraine when Russia is conventionally weakened. Therefore, it is not the technologies themselves that increase risks of escalation, but their impact on decisionmakers’ perceptions of the potential costs of offensive military operations and escalation. Nonetheless, the role of emerging technologies in Ukraine should not inspire complacency because of the impact of new actors, new escalation pathways, and compressed timescales. These trends will have implications for nuclear policy and require more inclusive approaches to risk reduction and arms control, to include an increased focus on behaviors rather than capabilities.

Introduction

On 3 May 2023, two drones exploded over the Kremlin in what Russian officials claimed was an assassination attempt on President Vladimir Putin. US Secretary of State Antony Blinken dismissed these claims and said that anything coming out of the Kremlin should be taken with “a very large salt shaker” (Tannehill Citation2023). Drones and other modern military weapons have been widely employed throughout the war in Ukraine, and their use has raised questions about the wider impact of emerging technologies on crisis escalation. Many studies examine single technologies in isolation, such as the impact of AI on nuclear decision-making, rather than the intersection of various military capabilities. This relatively narrow focus often comes at the expense of understanding the wider influence of emerging technologies on the nature of warfare, to include potential nuclear use. Events in Ukraine offer a first glimpse into whether these capabilities signal a fundamental shift in strategic stability, to include risks of crisis escalation, or whether the hype around emerging technologies is just background noise.

We use a definition of emerging technologies from the University of Hamburg, to include “those technologies, scientific discoveries, and technological applications that have not yet reached maturity or are not widely in use but are anticipated to have a major – perhaps disruptive – effect on international peace and security” (Favaro, Renic, and Kuhn Citation2022). At the outset, it is important to acknowledge the definitional challenges of “emerging technologies” and the ambiguity around when a technology reaches “maturity” and whether or not it has a “disruptive” effect.Footnote1 This definitional issue is particularly challenging against the backdrop of geopolitical and technological competition, as states are rapidly advancing their arsenals to incorporate technologies such as artificial intelligence (AI). For the purposes of this paper, we are focused on technologies or technological applications that are still evolving in the context of their impact on international conflict. Essentially, we are tracking a moving target.

How will emerging technologies impact crisis escalation? What has been the escalatory – or de-escalatory – effect of emerging technologies in contemporary crises? And can the use of emerging technologies increase risks of nuclear use? To answer these questions, we use the ongoing war in Ukraine as a case study to identify how emerging technologies are being used, their impact on tactical and strategic developments, and associated risks of escalation, potentially to include nuclear use. While only Russia possesses nuclear weapons in the crisis, we nonetheless consider it to be a justifiable case study because nuclear weapons have been a strategic consideration throughout the conflict. According to the New York Times, Russian military leaders discussed potentially using tactical nuclear weapons in fall 2022 as Russia was losing on numerous fronts (Cooper, Barnes, and Schmitt Citation2022). This, along with Putin’s regular nuclear rhetoric, demonstrates that nuclear weapons can play a role in conflicts with a non-nuclear adversary and still run the risk of escalating to nuclear use. Additionally, NATO has increasingly supported Ukraine in the conflict and includes nuclear possessors, who provide extended nuclear deterrence to members of the alliance. While it has avoided direct military intervention or meeting Russia on the battlefield, NATO’s nuclear umbrella and conventional superiority have, arguably, deterred Russia from expanding its military operations beyond Ukraine.

We argue that emerging technologies gave Russia a false sense of supremacy in the lead-up to the war in Ukraine and have largely failed to deliver Russia battlefield victories. As a result, Moscow has increased reliance on nuclear weapons and nuclear threats. This reliance could be exacerbated in the aftermath of the war in Ukraine when Russia is conventionally weakened. Therefore, it is not the technologies themselves that increase risks of escalation, but their perceived military contributions that could lead states to initiate or escalate conflicts. The limited role of emerging technologies on the battlefields in Ukraine should not inspire complacency. Instead, these technologies and their military applications provide a preview of ways that emerging technologies could contribute to escalation in the future both before and during conflicts.

This article proceeds in three parts. First, we explore various theories on the intersection of emerging technologies and crisis escalation and develop an analytical framework for assessing the potential escalatory impact of emerging technologies. This framework relies on concepts of wormhole escalation, inadvertent escalation, and cost tolerance escalation. Second, we examine the empirical evidence on the role of emerging technologies in Ukraine. We focus on hypersonics, cyber operations, deep fakes, drones, enterprise AI, and anti-satellite capabilities, which are the most novel technologies to be used in the war but have not yet reached maturity and have potentially disruptive effects. Using the analytic framework, we conclude that emerging technologies do not lead to escalation on the battlefield, as often feared in escalation theories; rather, they can lead to escalation by giving states a false sense of supremacy, as seems to have been the case in Russia’s invasion of Ukraine. Nonetheless, we identify three ways that emerging technologies could have a more direct escalatory impact during crises and offer policy recommendations for mitigating these risks.

Theories of Escalation and Emerging Technologies

Risks associated with military technology and crisis escalation are not unique to the current age. Traditionally defined, crisis stability refers to a situation in which neither side has an incentive to use nuclear weapons first, out of fear that the other side might retaliate in kind (Schelling 1966, 234). Another way to approach crisis stability is as scenarios in which “emotion, uncertainty, miscalculation, misperception, or the posture of forces” do not incentivize leaders to strike first, and incentivize leaders “to avoid the worse consequences of incurring a first strike” (Kent and Thaler Citation1990). This definition emphasizes that, in addition to technical characteristics, other psychological, political, and strategic factors are also relevant to crisis stability. This approach also treats crisis stability more broadly, to include reduced incentives to initiate a conflict in the first place, not just incentives relating to nuclear use. Bruusgaard and Kerr (2020) highlight an additional complication to the contemporary definition of crisis stability by adding that “the current information environment presents additional challenges for retaining stability in crisis”. This includes new tools of dis- and misinformation and an abundance of unverified data available to decision-makers. Conversely, escalation risks refer to factors – such as mistakes and miscalculation, or crisis mismanagement – that could lead to nuclear use (Hersman, Claeys, and Williams Citation2022). Potential escalation risks are becoming more complicated, to include not only technological factors but also informational and psychological ones.

Arms racing also presents its own set of escalation risks, due to unregulated competition. Acton (Citation2013, 121) defines arms race stability as the absence of perceived or actual incentives to build up nuclear forces out of fear that in a crisis an opponent would gain a meaningful advantage by using nuclear weapons first. Schelling and Halperin (Citation1961) noted during the Cold War, “the present arms race seems unstable because of the uncertainty in technology and the danger of a decision break-through. Uncertainty means that each side must be prepared to spend a great deal of money; it also means a constant fear on either side that the other has developed a dominant position, or will do so, or will fear to do so, with the resulting danger of premeditated or pre-emptive attack” (p. 37). More recently, Gottemoeller (2021) coined the term “the standstill conundrum” to refer to the ability of big data analytics and new sensors to render submarines and mobile missiles (i.e. guarantors of second-strike capabilities) vulnerable to detection. New technologies, therefore, could give a real or perceived advantage to offense by increasing first-strike incentives, but also to defense by potentially undermining legacy strategic systems.

Escalation pathways typically fall into three main categories: inadvertent, accidental, and intentional. Lin (Citation2012) provides a breakdown of these types: Inadvertent escalation is the result of one side taking deliberate actions that they believe are non-escalatory but are perceived as escalatory by the other side. Accidental escalation results when operational actions have direct unintended effects. And intentional escalation occurs when a side deliberately escalates a conflict to either gain advantage, pre-empt, signal, penalize, or avoid defeat (p. 53). H. Kahn (Citation1965) proposed that escalation in the Cold War would likely unfold in a series of quasi-predictable steps of increasing risk and intensity, culminating in a 44-rung escalation ladder. Within the escalation ladder there are opportunities for states to manage and reduce risks to prevent further escalation. However, as previously highlighted, escalation today may be more complex, as both states and non-state actors acquire increasingly capable and intrusive digital information technologies and advanced dual-use military capabilities.

Among the growing body of scholarship on emerging technologies, strategic stability, and crisis escalation, there are numerous schools of thought and approaches. From this literature, we identify three broad theories of escalation, some of which overlap: wormhole escalation, inadvertent escalation, and cost tolerance escalation. By no means are these exhaustive, but they offer useful points of comparison for thinking about how escalation happens. In using these theories to inform an analytical framework, we assume that states will continue to compete in technological innovation, and that emerging technologies will have military applications. The war in Ukraine has demonstrated that the use of these technologies on the battlefield is a new military reality. The focus should not be on whether or not to employ them in warfare; instead, we should focus on how they are employed, to explore whether certain applications and activities are more escalatory than others. We also assume that technology is neutral; it is neither good nor bad, neither escalatory nor de-escalatory by nature. Rather, the impact of emerging technologies depends on a host of factors, including their applications, the actors involved, and the stakes at risk. Singer asserts that it is “very clear that technology, and in particular, new technologies matter, and have been incredibly important [in this war], but are they the only important thing? Of course not” (Guyer Citation2022). Indeed, we must recognize that there are elements of both continuity and change when examining the impact of emerging technologies on warfare.

Turning to three schools of thought about emerging technologies and escalation, the first, “wormhole escalation”, focuses on gray zone conflicts and the potential for a non-kinetic conflict to jump “rungs” on the escalation ladder in a non-linear fashion. As described by Hersman, “Holes may suddenly open in the fabric of deterrence through which competing states could inadvertently enter and suddenly traverse between sub-conventional and strategic levels of conflict in accelerated and decidedly non-linear ways”. Sub-conventional tactics, such as disinformation and influence campaigns; conflicts at the conventional-nuclear interface, such as a breakdown in escalation firebreaks; and non-linear strategic crises, particularly between regional nuclear actors, are all contexts in which emerging technologies could contribute to escalation. With the return of great power competition, Russia and China are actively seeking ways to utilize their perceived asymmetric advantages, such as disinformation campaigns and autocratic decision-making, to challenge American hegemony and the rules-based international order. For example, the 2022 US National Defense Strategy points to a host of strategic challenges, all of which carry escalation risks, to include “emerging technologies; competitor doctrines that pose new threats to the US homeland and to strategic stability; an escalation of competitors’ coercive and malign activities in the ‘gray zone’”, along with other factors. To summarize, wormhole escalation happens when emerging technologies prompt crises to jump quickly from one level of conflict, such as non-kinetic, to a higher level, in non-linear ways at the regional or strategic levels. Such technologies might include disinformation campaigns, cyberattacks, or other capabilities that muddy the “fog of war”.

Second, inadvertent escalation was a concern during the Cold War and has been revived in recent years, largely thanks to the work of James Acton, who identified how the “entanglement” of cyber- and space-based capabilities with nuclear systems could escalate crises. In his 1982 study on inadvertent escalation, Barry Posen cautioned:

[I]ntense conventional operations may cause nuclear escalation by threatening or destroying strategic nuclear forces. The operational requirements (or preferences) for conducting a conventional war may thus unleash enormous, and possibly uncontrollable, escalatory pressures despite the desires of American or Soviet policymakers. Moreover, the potential sources of such escalation are deeply rooted in the nature of the force structures and military strategies of the superpowers, as well as in the technological and geographical circumstances of large-scale East-West conflict.

(Posen 1982, 28–29)

In the course of conducting a conventional war, an aggressor could threaten attacks on nuclear forces, leading to inadvertent escalation by prompting a response. Lin describes inadvertent escalation as potentially occurring because of incomplete information, a lack of shared definitions, and communication breakdowns. Emerging technologies complicate this scenario because they reshape what constitutes a “conventional” conflict and because of their integration with nuclear command and control. In the specific scenario of “entanglement”, Acton captures the risk that attacks on enemy nuclear forces, to include command and control, might not be deliberate but would be interpreted as such (Acton Citation2018b). To summarize, inadvertent escalation happens when a state targets, either intentionally or not, another’s strategic or nuclear assets, creating “use it or lose it” scenarios and forcing a response. Emerging technologies can exacerbate inadvertent escalation by creating entanglement scenarios that threaten nuclear command and control or nuclear assets.

Third, a less explored theory of escalation in the context of emerging technologies is cost tolerance, which relies heavily on perceptions of military capabilities and the ability to impose and absorb costs. Unlike the other two theories, which focus on how escalation happens once a conflict has begun, a theory of cost tolerance focuses on why states initiate a conflict in the first place. This approach is particularly relevant for questions about a crisis escalating to involve nuclear weapons, as any crisis involving a nuclear possessor inherently comes with the risk of nuclear use. Any conventional conflict, therefore, carries an implicit risk of nuclear use being threatened or exercised. There are two main components to a cost tolerance theory of escalation. The first is that states will initiate conflicts when they assume low costs and/or have the resolve to see through their strategic intentions. Patricia Sullivan, for example, has posed the question of “Why Powerful States Lose Limited Wars” and concluded that “strong states become more likely to underestimate the cost of victory as the impact of resolve increases relative to that of war-fighting capacity” (Sullivan 2007, 496). Sullivan focuses on asymmetric wars, wherein “strong states select themselves into armed conflicts only when their prewar estimate of the cost of attaining their political objectives through the use of force falls below the threshold of their tolerance for costs. The more the actual costs of victory exceed a state’s prewar expectations, the greater the risk that it will be pushed beyond its cost-tolerance threshold and forced to unilaterally withdraw its forces before it attains its war aims”. Emerging technologies could shape prewar estimates, perceptions about the military balance, and the likelihood of success.

A second component of this theory of escalation concerns the assumption that actors decide to initiate conflicts on the basis of accurate information. States’ ability to conduct cost-benefit analyses and to understand their risk tolerance hinges on the information and intelligence available to them, and on its credibility. Lawrence Freedman has highlighted the risk of “distortion in political decision-making” and its influence on decisions to initiate military conflict in the first place (Freedman 1991). Our own research supports this conclusion: in a 2021 study drawing on expert surveys, we found that “technologies that distort” are the most concerning in terms of nuclear risk, due to their potentially high impact and the high feasibility of their implementation (Favaro Citation2021). To summarize, a cost tolerance theory of escalation is concerned with why actors initiate crises in the first place and with their estimates of the likelihood of victory. Emerging technologies have the potential to exacerbate escalation dynamics by shaping decision-makers’ perceptions of their military capabilities, i.e. the costs they can impose, their strategic advantage, and the potential costs they will have to absorb vis-à-vis a potentially weaker adversary.
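The cost tolerance logic can be stated compactly. The notation below is ours and purely illustrative (it is not drawn from Sullivan’s formal model): it simply restates the claim that a state initiates or escalates when its prewar cost estimate falls below its cost-tolerance threshold, and that distorted information and perceived technological supremacy can bias that estimate downward.

\[
\text{initiate or escalate} \iff \hat{C}_{\text{prewar}} < T, \qquad \hat{C}_{\text{prewar}} = f(\text{perceived military balance},\ \text{quality of available information})
\]

where \(\hat{C}_{\text{prewar}}\) is the estimated cost of attaining political objectives by force and \(T\) is the state’s cost-tolerance threshold. The larger the gap between actual costs \(C_{\text{actual}}\) and \(\hat{C}_{\text{prewar}}\), the greater the risk that the state is pushed beyond \(T\) and faces a choice between withdrawal and further escalation to terminate the war on acceptable terms.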

There are important commonalities across these theories. To avoid nuclear war, avoid conventional war. Information and perception are fundamental to whether or not technologies lead to escalation. And emerging technologies add complexity and uncertainty to crisis decision-making. In this article we do not set out to test these theories against each other, but rather to use them as frames for exploring whether emerging technologies have had escalatory effects in the war in Ukraine by crossing domains, jumping “rungs” on the escalation ladder, increasing risks of misperception, and influencing willingness to accept risk, uncertainty, and costs.

Emerging Technologies in the War in Ukraine

Emerging technologies have been used extensively in the war in Ukraine. Both Russia and Ukraine have deployed a range of innovative military technologies, from armed drones and electronic warfare systems to cyberattacks and AI-enabled intelligence gathering. Many experts expected that Russia’s larger military and investment in modernizing its forces would give it a technological advantage; however, Ukraine has effectively leveraged emerging technologies to resist and to launch counteroffensives. Notwithstanding the tragedy of the situation, the war in Ukraine has given us a new set of datapoints on the role of emerging technologies in warfare. The insights into the role of emerging technologies in the war in Ukraine are not straightforward. On the one hand, this war has been very conventional and has exhibited a high degree of continuity with warfare of previous decades (or even centuries). On the other hand, emerging technologies have changed the character of war by creating new vectors for crisis escalation, new capabilities on the battlefield, and an opportunity for new actors to play a key role in conflict. It is worthwhile to explore these technologies in isolation, as a suite of capabilities, and as a wider phenomenon in the war in Ukraine.

Hypersonics

Hypersonic weapon systems are capable of exceeding speeds of Mach 5 within the atmosphere. A combination of speed, maneuverability, stealth, and the ability to evade defensive systems makes hypersonic weapon systems unique. Russia has used the air-launched hypersonic Kinzhal on at least three occasions in Ukraine to hit hardened targets (US News Citation2022). This may have been intended to signal a willingness to escalate, but numerous experts described it as “serious overkill”, wasteful, and an unnecessary show of force (Galeotti Citation2022). There is speculation that Russia was running out of other missiles (i.e. Iskander and Kalibr) or that it was a propaganda tactic. Either way, this very limited use of hypersonics does not seem to have conferred any strategic advantage on Moscow in the war in Ukraine.

Cyber operations

Ukraine’s digital infrastructure was hit with cyberattacks hours before any missiles were launched or tanks were moved on 24 February 2022. A Microsoft study from April 2022 found that Russian cyber actors frequently conduct intrusions in concert with kinetic military actions. The study emphasizes the distorting capabilities of cyber operations, noting that “[c]ollectively, the cyber and kinetic actions work to disrupt or degrade Ukrainian government and military functions and undermine the public’s trust in those same institutions” (Microsoft Digital Security Unit Citation2022, 2). Network operations such as wiper attacks have not only degraded the functions of the targeted organizations (i.e. Ukrainian government, IT, energy, and financial organizations), but “sought to disrupt citizens’ access to reliable information … and to shake confidence in the country’s leadership” (Microsoft Digital Security Unit Citation2022, 2). While the cyber domain is not a novel threat vector, Russia has allegedly used AI to augment cyberattacks and Ukraine has used AI to augment cyberdefenses (L. Kahn Citation2022; Vectra Citation2022). The use of AI to detect, defend against, and facilitate cyberattacks has the potential to speed up discovery, evaluation, and response processes far beyond human abilities.

Deep fakes

Another widely publicized AI application in this war has been the use of AI for information warfare. AI can be used to distort and weaponize information in both peace- and wartime, including in the form of “deep fakes”, synthetic media in which an image or video is manipulated. Such techniques have the potential to generate a level of contrived realism that surpasses prior techniques used to falsify information. For example, Russia created and disseminated a deep fake video of Volodymyr Zelensky asking Ukrainian soldiers to lay down their arms and surrender; the video circulated on social media,Footnote2 but it failed to convince viewers of its veracity and was swiftly removed from social media platforms (Simonite Citation2022). Zelensky doubtless benefited from his high profile and the stakes at play; nonetheless, the episode underscores that deep fakes are not hugely successful tools for grand deception.

Drones

The Russian military is using a lineup of military-grade unmanned aerial vehicles (UAVs), or drones, in this war to conduct intelligence, surveillance, and reconnaissance (ISR) and to support combat missions over Ukraine (Bendett and Edmonds Citation2022). Russia has claimed that it used an AI-enabled (i.e. autonomous) drone, but this claim appears to be all signal and no substance. Though the manufacturer claims in promotional materials that the KUB-BLA can independently identify targets (Knight Citation2022), there is no evidence that it was used to do so in Ukraine. Ukraine has also been successful in its counter-drone measures and is itself using UAVs to great effect. Most prominently, the Turkish-made Bayraktar TB2 (i.e. a medium-altitude long-endurance drone) has played a role in targeting and countering Russian military advances, alongside much smaller drones used for ISR (McGee Citation2022; M. Jankowicz Citation2022). In addition to the military drones being used for surveillance and to strike targets on the ground, the scale of commercial UAV usage in Ukraine has become emblematic of Ukraine’s “Do It Yourself” methods of countering Russian attacks. Commercial drones are much cheaper than their military counterparts, easy to acquire, and viewed as disposable, making them attractive to groups with limited resources. Finally, it is worth noting the presence of the US-made (and donated) Switchblade and the Russian-made Lancet and KUB,Footnote3 which strike targets in a kamikaze fashion. These are loitering munitions that hover over the battlefield, where they search for a particular class of targets. While humans are often involved in confirming a strike remotely, these systems are technologically capable of doing so on their own. Again, there is little evidence that they have independently identified targets in Ukraine, though the difficulty of determining when full autonomy is used in a lethal context is worth flagging.

Enterprise AI

Enterprise AI refers to the way an organization – or, in this case, a military – incorporates AI into its infrastructure. Potential uses include intelligence gathering and analysis, early warning, and just-in-time wargaming/simulations that generate AI-recommended courses of action. These processes augment and, in some circumstances, replace human perception and judgement. Both Ukraine and Russia are using AI to sift through the data generated by drones with ISR functions and commercial satellite images (Wyrwal 2022). Ukraine has also been using facial recognition technology to identify Russian operatives and soldiers, natural language processing for voice recognition, transcription, and translation services, and AI-enabled analysis to predict its ammunition and weapons needs (Tucker Citation2022). Jim Mitre, the director of the International Security and Defense Policy Program at the RAND Corporation and a former defense official, was quoted as saying, “It’s [Ukraine’s] ability to process information at a faster clip than the Russians that is having a big impact here” (Guyer Citation2022).

Anti-satellite capabilities

Anti-satellite (ASAT) capabilities fall into two categories: ground-to-orbit capabilities, which deliver a kinetic or directed-energy effect from Earth against targets in space, and co-orbital capabilities, which deny or degrade space assets and enable the covert or overt modification of satellites on orbit. ASAT capabilities are not new technologies, but their use is concerning for the growing number of state and non-state actors who rely on a sustainable space environment. Based on publicly available data, many ASAT activities in the war in Ukraine have involved the use of directed energy to detect, jam, and otherwise thwart signals from satellites. Electronic warfare is integral to how modern militaries fight (Atherton Citation2022), a fact acknowledged by the head of Roscosmos (i.e. the Russian space corporation), who announced in March 2022 that Russia would treat any offlining of its satellites as a justification for war (Reuters Citation2022b). Nonetheless, Russia conducted a cyberattack against the satellite internet provider Viasat at the outset of the invasion (formally attributed to Moscow in May 2022) with the intention of disrupting Ukrainian command and control (Reuters Citation2022b). The hack had immediate knock-on consequences for satellite internet users across Europe. In March 2022, Ukrainian forces reportedly captured a Krasukha-4 electronic warfare system left by the retreating Russian army (Trevithick 2022). Russia has increasingly made use of electronic warfare to detect, intercept, and disrupt communications as cyberattacks and ground strikes against Ukraine’s communications infrastructure have proved to have limited impact (Antoniuk Citation2022).

False Sense of Supremacy and Avoiding Complacency

To better understand the impact of emerging technologies on crisis escalation in Ukraine, we can examine these trends through the lens of the theories of escalation. Looking first to wormhole escalation, emerging technologies did not have a significant escalatory impact on the battlefield in Ukraine once the conflict began. Concerns about jumping “rungs” on the escalation ladder have not (yet) been realized, although Russian military discussions about tactical nuclear weapons use suggest there is potential for wormhole escalation, particularly in situations where Russia faces conventional defeat with few remaining military options. The use of emerging technologies in Ukraine has expanded the war across domains in potentially destabilizing ways. For example, cyberattacks against civilian infrastructure in Ukraine and against NATO members that supply Ukraine risk provoking a wider war. Emerging technologies have also increased uncertainty given their novelty and impact, complicating signaling and increasing the risk of miscalculation. But overall, emerging technologies have not been the decisive factor on the battlefield; Russia’s performance in the war, for example, is largely being determined by manpower and command issues rather than by any new technologies. This war has resembled warfare of previous centuries more than it resembles science fiction.

Turning to inadvertent drivers of escalation, there is little evidence of these in the war in Ukraine. Russia’s invasion was decidedly intentional: it was not due to misperception of any imminent threat from Ukraine or NATO, nor to any pre-emptive threats to its strategic forces that would have created “use it or lose it” scenarios, even at the conventional level. The “entanglement” theory of escalation is somewhat problematic in the case of Ukraine because only Russia, not Ukraine, possesses nuclear weapons; however, with the growing involvement of Western and NATO forces, this risk should nonetheless be taken seriously, and there are other risks of inadvertent escalation aside from “entanglement” scenarios.

Finally, the cost tolerance escalation theory suggests perceptions about supremacy prior to a conflict and likelihood of success, to include the impact of emerging technologies, may have had an escalatory effect by emboldening Russia’s actions. Prior to February 2022, Russia was perceived to have overwhelming military superiority vis-à-vis Ukraine, including in Western intelligence analyses. To be sure, emerging technologies were not the only factor that contributed to Russian miscalculation. Centralized decision-making, underestimating Ukrainian resolve, and discounting Western support also led to misperceptions in Moscow and elsewhere. Perceived supremacy may have been particularly acute following Russia’s military buildup over the previous two years and development of advanced capabilities such as dual-capable hypersonic weapons, disinformation campaigns, and a fleet of drones.

To summarize, emerging technologies did not have a noticeable escalatory or de-escalatory effect during the Ukraine crisis that would distinguish them from other military capabilities and applications. Where emerging technologies potentially did have an escalatory effect was prior to the crisis, by giving Russia a false sense of supremacy and shaping perceptions in Moscow about the likelihood of success and the relatively low costs of invasion. This argument comes with an inherent humility, as no one can know what Putin’s perceptions were and Kremlin decision-making is notoriously murky. While the tactical impacts might be limited, the strategic impacts of emerging technologies more broadly point to concerning trends for escalation. Indeed, there are other trends in the war in Ukraine that should caution against complacency about the escalatory potential of emerging technologies, to include the potential for wormhole and inadvertent escalation. Three such trends are new actors, new escalation pathways, and new timescales.

New Actors

Technology companies – to include drone manufacturers, AI companies, commercial space actors, social media companies, and cybersecurity companies – have acted as force multipliers on both sides of the war in Ukraine. Gregory Allen of the Center for Strategic and International Studies suggests that this war has highlighted the relevance of commercial off-the-shelf technology in war: “Across drones, [AI], and space, commercial technology is flexing military muscle to a greater extent than at any time since the end of the Cold War” (Allen Citation2022). One notable example is the Chinese-made DJI Mavic drones, which have emerged as a key combat multiplier in the war in Ukraine. In another example, civilian researcher Faine Greenwood has used media shared on Twitter, Telegram, YouTube, and other sites to track and log nearly 800 incidents in which consumer drones have been used in Ukraine. In March 2022, Ukraine accused DJI of supplying Russia with its proprietary AeroScope drone detection software to target Ukrainian forces flying the drones. As of April 2022, DJI halted sales in Russia and Ukraine to prevent the use of its drones in combat (Kirton Citation2022). But DJI is far from the only private-sector actor playing a role in Ukraine.

In the cyber domain, Ukraine’s defensive capabilities have been enhanced by cybersecurity companies and so-called “hacktivists”. As regards the former, cybersecurity companies such as Microsoft, Bitdefender, CrowdStrike, and Vectra AI have extended their support to Ukraine to combat Russian ransomware, wipers, spear-phishing, and distributed denial of service (DDoS) attempts. As regards the latter group, the online hacker collective Anonymous declared “cyber war” on Russia when the invasion began (Anonymous 2022) and its members have devoted their skills to activities that range from the banal (e.g. DDoS attacks and data theft) to the imaginative (e.g. creating a massive traffic jam in Moscow by ordering dozens of taxis from the ride-hailing app Yandex Taxi to converge on the same location) (Gordon and Franceschi-Bicchierai Citation2022). Anonymous comprises part of the “IT army” of Ukrainian volunteers. The role of both private cybersecurity companies and hacktivists reinforces our argument that new actors are playing a role in safeguarding the information ecosystem.

In the space domain, Ukrainian Minister of Digital Transformation Mykhailo Fedorov sent an appeal via Twitter to the world’s commercial satellite companies on 1 March 2022, requesting that they share imagery and data directly with Ukraine. Commercial space capabilities answered the call, with a range of providers granting near-real-time monitoring of Russian military activities and support to Ukrainian forces, humanitarian organizations, and journalists covering the invasion. The satellite company Planet Labs has played a particularly significant role in this conflict. Satellite imagery can be used to track damage directly (e.g. by spotting fires, changes in the landscape, or direct damage to known sensitive sites) or in combination with social media information (100 Days in Ukraine, 2022). The company also plays a role in the information ecosystem; as recently as August 2022, it released images that contradicted Russian claims regarding a recent attack on a Russian military base in Crimea (Borowitz Citation2022). The activities of commercial satellite companies come, of course, on top of those of private space companies like SpaceX, which provided satellite internet service to Ukraine after its connectivity was disrupted following Russia’s invasion.

The enterprise AI functions mentioned in the previous section, namely the use of AI for facial recognition and natural language processing, have also been enabled by private sector actors. The Ukrainian Ministry of Defense has been using Clearview AI facial recognition software to build war crimes cases and to identify deceased soldiers. It is also being used at checkpoints and could help reunite refugees with their families. Meanwhile, the AI company Primer has modified its commercial AI-enabled voice transcription and translation service to listen in on intercepted Russian communications and automatically highlight information of relevance to Ukrainian forces in a searchable text database. Both companies have granted Ukraine free access to their software.

Finally, social media companies have acted as gatekeepers of online civic culture in this war. Since the early days of the invasion, Google, Meta, Twitter, Snapchat, and TikTok have actively removed or demoted Kremlin propaganda or refused to run ads in Russia. Some commentators speculate that these companies are keen to rehabilitate their reputations after facing questions in recent years over violating anti-trust law, infringing on privacy, and spreading toxic and divisive content. This point is picked up in the following subsection.

New Escalation Pathways

Social media has emerged as a new vector for information dissemination – both true and false – but it has also provided an opportunity for social media companies to assume more responsibility for maintaining a healthy online civic culture. From the Russian perspective, social media has played a crucial role in the spread of false claims, assisted by dozens of Russian government Twitter bot accounts. Russian disinformation campaigns have sought to delegitimize Ukraine as a sovereign state, spread mistruths about Neo-Nazi infiltration of the Ukrainian government, push “whataboutisms” that downplay the invasion of Ukraine by drawing attention to alleged war crimes perpetrated by other countries, and spread conspiracy theories about US biological weapons laboratories in Ukraine (Graham and Thompson Citation2023). Russian disinformation campaigns have also featured fake and heavily manipulated videos, as discussed in the previous section.

Ukraine’s social media prowess also warrants attention. President Zelensky has made extensive use of social media to broadcast selfie-style videos to the outside world, giving rise to the so-called “Zelensky Effect”. An actor-turned-politician, Zelensky is a natural messenger for the social media moment. His appeals are complemented by those of Minister Fedorov, who has launched public pressure campaigns that effectively recruited some of the world’s most powerful technology companies (e.g. SpaceX, Apple, DJI) to join the Ukrainian war effort (Zakrzewski and De Vynck Citation2022). More broadly, Ukraine’s official media team has used social media to document the country’s grim reality unflinchingly. Some experts maintain that Ukraine’s tight focus on social media has been key to its success (Khan Citation2022).

Whilst social media platforms have emerged as amplification systems for true and false claims alike, social media companies have simultaneously shown an unusual willingness to take a stand. These companies have prioritized their responsibilities to Ukrainian users and their ties to democratic governments over their desire to remain neutral, even at the cost of being banned in Russia (Oremus Citation2022). The resulting information blockade inside Russia appears to be having some success in convincing Russians that the war is justified (Dougherty Citation2022). Moreover, these decisions have illustrated how internet platforms have been scrambling to adapt content policies built around notions of political neutrality to a wartime context (Oremus Citation2022). Many commentators have been surprised to witness an industry long reluctant to bend to political demands submit in the absence of any legislation or economic leverage.

New Timescales

Many of the technologies explored here are potentially disruptive because of their speed and their ability to compress decision-making. It is worth considering how new technologies create new timescales of effects in a crisis. For example, Russian use of deep fakes has clearly failed to achieve strategic or tactical objectives in the short term. Meanwhile, Ukraine’s success in defending against attacks on its information environment is attributable (at least in part) to the fact that Ukrainians are well-acquainted with Russian tactics from decades spent under Soviet rule (N. Jankowicz Citation2020, 128). Political correspondent Vera Bergengruen reaffirms this, asserting that Russia “[floods] the zone with so much information that people don’t really know what to believe. And in this case, Ukrainians were able to get ahead of it” (Davies Citation2022). Ukraine debunked – and even “prebunked” (i.e. warned citizens of the possibility of Russian information operations before they were disseminated) – disinformation narratives before they were able to take root via, inter alia, a sophisticated social media outreach strategy.

What this leaves out are the longer-term effects of AI-enhanced information operations. While synthetic media could theoretically lead to the use of nuclear weapons (e.g. a deep fake video of a head of state ordering their use), several verification measures would need to fail for such an order to be carried out. Experts suggest that deep fakes are more likely to complement a kinetic strike or cyberattack (Favaro, Renic, and Kuhn Citation2022). In these cases, the objective could be to throw the target off-guard before a strike and/or to delay a response until it is too late. Perhaps more concerning is the possibility that the compound effect of sustained information operations could create divides in threat perceptions amongst national security staffers, including over what information they find credible. This could muddy the strategic waters for decision-makers, even if they give more credence to intelligence reports and internal briefings than to live news and social media feeds. Finally, this technology could destabilize public opinion and create pressure on decision-makers to act quickly in a crisis. On a societal level, one longer-term effect could be the “truth decay” phenomenon, which refers to a diminished reliance on empirics and analysis within society broadly, and especially within political discourse (Kavanagh and Rich Citation2018). That is, of course, in the absence of initiatives that strengthen societal resilience to disinformation (N. Jankowicz Citation2020).

Reflecting on what is potentially new or novel about the impact of emerging technologies on the course of the war in Ukraine: the use of drones and satellites in war is not new, but the use of small, commercial drones and satellites is relatively new. Small, commercial drones and satellites being used in a war between two relatively advanced militaries is new. The manufacturers of these drones and satellites being seen as alternately heroic – as in the case of SpaceX’s Elon Musk – or suspect – as in the case of Chinese-owned DJI – is new. In summary, the first change caused by emerging technologies is the presence of new actors exerting influence on the information ecosystem and on the course of the war more generally. Second, emerging technologies have generated new vectors for information diffusion (and thereby for crisis escalation), which include social media platforms, whose owners find themselves in a content moderation role. The third change prompted by these emerging technologies concerns the timescales of effects. Not unlike the landmines of previous wars, which continue to cause damage long after a war ends, the compound effect of sustained information operations could create a polluted information environment with adverse effects for years to come. Returning to our analytical framework, however, many of these emerging technologies will inform pre-crisis escalation decision-making and will likely be assessed as a suite of military capabilities and applications. It is not a new phenomenon for perceived technological advantage to lead to offensive military action or crisis escalation. But the new actors, new pathways, and new timescales discussed above may increase the risks of misperception around the costs and benefits associated with those new technologies.

Implications for Escalation Management, Risk Reduction, and Arms Control

Our primary finding is that emerging technologies can give states a false sense of supremacy; this dynamic could be exacerbated over time and could complicate policy options for escalation management, risk reduction, and arms control. Emerging technologies could embolden states to favor offensive action. But they could also lead countries to increase their reliance on nuclear weapons. In the case of Ukraine, for example, Russia will likely increase its reliance on nuclear weapons because its conventional forces have been weakened by the war and will need to be rebuilt in its aftermath. Additionally, nuclear weapons can underpin regional aggression, coercion, or opportunism by states such as Russia and China.

While emerging technologies have highlighted the role of new actors and tools in conflict, it is too soon to conclude whether they have a consistent impact on crisis stability. Many of these technologies remain under development, and their future applications and implications are not yet fully understood. There has been significant investment from major powers like the United States, China, and Russia in military AI applications, and these capabilities are expected to play a key role in future wars. However, most military AI applications are still in early research, development, or testing phases. Importantly, neither Russia nor Ukraine has deployed AI-enabled weapons that autonomously select and engage targets in the current conflict (L. Kahn Citation2022). Even in the cases of enterprise AI functions such as those provided by Clearview and Primer that are being used for intelligence analysis, there is limited publicly available data with which to meaningfully evaluate their use and impact. The available evidence from Ukraine suggests that military AI applications are still too unreliable for combat use or to be trusted by human operators. Russia attempted to showcase advanced weapons like AI-enabled drones and hypersonics in Ukraine, but many of these capabilities either underperformed or were exaggerated by Russian claims. For example, the KUB-BLA drone has autonomous functions, but there is no evidence that they were used.

Another reason for a more nuanced and balanced approach to the impact of emerging technologies on the battlefield is that the majority of scholarship has focused on their offensive applications, while there will also be parallel developments in defensive and protective measures. Russia seemingly had technological superiority and tried to leverage it against Ukraine through cyberattacks and denial of internet access. Ukraine successfully deflected most of these attempts, often with help from private actors like SpaceX. Analysis from June 2022 suggests that only 29% of Russian cyberattacks conducted in the first months of the war were successful (Smith Citation2022). This is attributable to the fact that Ukraine was well-positioned to fend off cyberattacks, having endured them for many years. Cybersecurity efforts were also bolstered by companies providing threat intelligence and incident response services, highlighting the many roles private sector actors have played in this war. Russia’s poor performance in Ukraine demonstrates that technological superiority alone does not guarantee military gains. Many emerging technologies may offer tactical advantages, but these need to be considered in the broader political context – including external support, asymmetry of stakes, and cost tolerance. One potential lesson is that emerging technologies offer limited strategic advantage and may therefore increase reliance on nuclear deterrence for existential scenarios and war termination. This is not a foregone conclusion, however, as technologies such as synthetic biology could have much greater strategic impact as they mature.

A challenge for managing the risks associated with many emerging technologies is the involvement of new actors, like the private sector, that sit outside traditional arms control mechanisms. To mitigate this challenge, private companies need to be included in discussions and policy development around arms control and risk reduction. For example, Western governments will need to break down silos between the public and private sectors to understand the strategic implications of emerging technologies, where they are proliferating, and how they can be weaponized. More broadly, governments and industry should support inclusive dialogues on the crisis risks of emerging technologies. Private companies like SpaceX or Twitter could sponsor Track 1.5 discussions with experts from industry, think tanks, and governments to identify specific risks. Governments should also break down silos between nuclear and non-nuclear states to develop joint risk reduction tools. Many agree that emerging technologies increase the chance of nuclear use, and our research shows this might be the case if technologies give states a false sense of supremacy and embolden offensive action. But multilateral institutions have been slow to address these risks due to complexity, crowded agendas, and polarization. Initiatives like Creating an Environment for Nuclear Disarmament (CEND) and the Stockholm Initiative may be well-positioned, alongside the Non-Proliferation Treaty (NPT), to drive progress on risk reduction and emerging technologies. Non-nuclear states especially have a role to play in countering disinformation and deep fakes.

In addition to including new actors, traditional arms control approaches will need to evolve to address the novel escalation pathways created by emerging technologies. Behavioral arms control initiatives, such as a ban on ASAT tests or limits on manipulating the information environment during crises, could help such efforts. A Disinformation Risk Reduction Network could be created for public and private actors to share threat intelligence and coordinate responses. However, the current geopolitical climate makes major new treaties unlikely. In parallel, governments and companies must build resilience to potential manipulation and interference. This should prioritize two domains: social media and satellites. Social media shapes narratives with potentially escalatory effects; resilience efforts should focus on civic engagement and educating publics about disinformation, and social media platforms have a key role and responsibility in this. Satellites are also crucial, as they provide communications and situational awareness during crises. Russia’s experience in Ukraine has demonstrated the consequences of poor intelligence and situational awareness. Resilience in space, through protected communications and the rapid reconstitution and redundancy of satellite capabilities, will be critical.

The compressed timescales driven by emerging technologies put a premium on slowing the pace of conflict and maintaining open channels of communication during crises. The demise of critical risk reduction agreements like the INF Treaty is an unfortunate example of what can happen when such channels break down. To avoid this, governments will need to proactively establish lines of communication with private sector leaders. For example, a modern hotline agreement could connect the US President and industry executives during a crisis to coordinate information flow and content moderation. More broadly, all actors should invest in sustained dialogue and signaling well before any triggering events. This will help avoid miscalculation and create trusted backchannels for managing any rapid escalation that does occur.

Conclusions

Returning to our initial research questions: How will emerging technologies impact crisis escalation? Thus far, drawing on the case study of the war in Ukraine, emerging technologies have had an emboldening effect by reducing the perceived costs of military escalation for the technologically superior country, Russia. Emerging technologies risk instilling a false sense of supremacy in states considering instigating a military operation or escalating a conflict. Beyond this, there is limited evidence that emerging technologies have had an escalatory or de-escalatory effect on the battlefield in Ukraine. What has been the escalatory (or de-escalatory) effect of emerging technologies in contemporary crises? Emerging technologies may have had an emboldening escalatory effect by shaping Russia’s perception of the costs associated with the invasion of Ukraine and the likelihood of success. The case of the war in Ukraine also points to the role of new actors, new escalation pathways, and new timescales in contemporary conflict, all of which are linked to the impact of emerging technologies. And can the use of emerging technologies increase risks of nuclear use? All three theories of escalation explored here point to the risk that any conventional war has the potential to escalate to nuclear use, albeit along different pathways. By bestowing a false sense of supremacy, emerging technologies could embroil nuclear actors in crises they otherwise would have avoided and result in strategic quagmires with potentially high costs of conflict termination, politically or militarily.

When interpreting signals of change, skepticism is crucial. On the one hand, the performance of Russia’s conventional forces should raise bigger questions about the state of its military-industrial apparatus. Uncertainty surrounding Russia’s general technological trajectory and its ability to compete with the United States was already a matter of debate among experts before the war in Ukraine. Given the manifold economic, financial, and technological sanctions imposed on Russia after its renewed attack on Ukraine, the existing barriers to Russia’s acquisition and deployment of emerging technologies may only increase in the years ahead. On the other hand, there is cause for skepticism vis-à-vis Ukraine’s use of new technologies, such as AI for facial recognition and natural language processing. Who is using these capabilities and to what effect? It seems unlikely that these enterprise AI functions could be seamlessly integrated into existing military structures without substantial changes to doctrine, organization, training, materiel, leadership and education, personnel, and facilities (i.e. DOTMLPF). This calls into question whether such claims are mere posturing on the part of Ukraine and/or a successful PR campaign by the AI companies. Moreover, it highlights the role that perceptions play in determining the effects of emerging technologies. Technologies that distort the information environment, in particular, challenge and confuse perceptions. And emerging technologies can also shape perceptions of likely costs and benefits prior to a crisis.

Acknowledgement

The authors are grateful to Suzanne Claeys and Reja Younis for their research assistance with this article.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Notes

1 An alternate definition comes from the North Atlantic Treaty Organization (NATO), which denotes those technologies or scientific discoveries that are expected to reach maturity in the period 2020–2040 and are not yet widely in use, or whose effects on defense and security are not entirely clear (NATO Science and Technology Organization [STO]).

2 It is unsurprising that we have seen deep fake technology deployed by Russia in the war in Ukraine, due to its lower price tag (relative to other emerging technologies surveyed in our study) and its confluence with Russia’s subthreshold warfare doctrine. Furthermore, using AI to enhance active measures and influence campaigns is a natural outgrowth of Russia’s history of information operations.

3 In June 2022, the Russian military admitted that it was also using the Lancet in Ukraine, possibly adding more fuel to the debate about the use of AI in this war, considering that this particular munition was advertised as “highly autonomous” for target identification and destruction (Tass Citation2022).

References