
Exploring the human dimension of nuclear security: the history, theory, and practice of security culture

ABSTRACT

Over the past two decades, the international community has devoted considerable attention to the human dimension of nuclear security. This trend is part of a more holistic approach to securing nuclear facilities, grounded in the concept of culture, that moves beyond the traditional focus of physical-protection measures. But what explains this shift and what does it entail? This article begins by demonstrating, through a series of real-life case studies, the potential for human factors to undermine nuclear-security systems. It then considers the rise and consolidation of “culture” as a concept used to better understand and organize international efforts to strengthen nuclear security. Nuclear-security culture is then explored in practice, drawing on a review of relevant initiatives as well as empirical research conducted by the authors at several UK nuclear sites. A number of likely challenges for developing an effective nuclear-security culture at the operational level are discussed, as is the value of the culture-focused guidance developed by the International Atomic Energy Agency. The article concludes that while nuclear-security culture has been widely promoted at the international level, there exists considerable scope for new initiatives to further strengthen engagement at the working level of industry.

For decades, security at nuclear facilities was synonymous with physical protection, focused on denying external adversaries access through a combination of “guns, guards and gates.”Footnote1 However, this focus has shifted over time, driven by the realization that these measures alone are unlikely to be effective against the full spectrum of contemporary threats. In particular, much attention has been devoted to the potential for security systems to be undermined from within. This problem may occur due to the actions of otherwise well-intentioned employees who unwittingly create security-related vulnerabilities that are exploited by others. Security may also be defeated through the deliberate actions of individuals with malicious intent and authorized access to nuclear assets, so-called “insiders,” who have been responsible for almost all known thefts of highly enriched uranium (HEU) and plutonium, materials that could be used in a devastating act of nuclear terrorism.Footnote2

As part of the effort to tackle these risks, considerable attention has been devoted to the human dimension of nuclear security. This includes the introduction of preventative personnel-focused security measures designed to reduce the risk of employees intentionally “exploiting their legitimate access to an organization’s assets for unauthorized purposes.”Footnote3 In the United Kingdom, for example, nuclear workers typically undergo a series of background checks as part of the recruitment process, aimed at identifying and excluding people with existing high-risk behavioral traits and backgrounds. After they begin their employment, individuals may also be required to undergo regular drug and alcohol testing, as well as monitoring by systems aimed at flagging significant behavioral changes.Footnote4

To be effective, the application of preventative measures requires the involvement of all staff, not just those working in security-related roles. This includes employees in human-resources, business, technical, operational, management, and other positions, as it is these individuals who may be most likely to notice significant changes in a co-worker’s general behavior or discern nonroutine acts that may indicate malicious intent. Engaging the full workforce in security also acts as a “force multiplier” for identifying broader security failings related to protective measures and enabling timely corrective action. This more holistic approach is generally framed in terms of “security culture,” a term intended to capture the shared nature of responsibility for nuclear security.

In the nuclear context, the concept of security culture has powerful political salience, something that was emphasized by the nuclear-security-summit (NSS) process, in which the leaders of key states and international organizations came together over the course of six years and four summits (2010–2016) to advance “a common approach and commitment to nuclear security at the highest levels.”Footnote5 Nuclear-security culture featured prominently in the summit communiqués and work plans, as well as in various individual commitments and pledges made by the states in attendance.Footnote6 Yet despite growing international recognition of its importance, particularly at the level of international politics, security culture as a key facet of nuclear security remains relatively underdeveloped in research terms. Few academic studies have examined its emergence as the conceptual anchor for efforts to strengthen the human dimension of nuclear security or explored how this has been translated to the operational level.

Against this background, this article begins by illustrating the different ways in which the human factor can undermine nuclear security. The analysis here is supported by examples from three real-life cases in which weaknesses on this front proved decisive in facilitating unauthorized access to nuclear facilities, the theft of nuclear materials, and the sabotage of nuclear systems, respectively. It then reviews how security culture has developed as a concept in the nuclear context, starting with its origins in nuclear safety before its recent rise to prominence in the rhetoric and practice of nuclear security. The final part of the article explores the value of culture-focused guidance developed by the International Atomic Energy Agency (IAEA) in identifying and addressing potential nuclear-security problems. The analysis is supported by a review of relevant international initiatives, as well as empirical research conducted by the authors at a number of nuclear sites managed by the United Kingdom’s Nuclear Decommissioning Authority (NDA).Footnote7 The article concludes with recommendations for future nuclear-security-culture efforts.

Exploiting human weaknesses in nuclear security

Although a major act of nuclear terrorism has yet to occur, there are numerous examples of nuclear materials and information being stolen and systems deliberately sabotaged.Footnote8 Three real-life cases, discussed later, illustrate how human weaknesses can be identified and exploited by adversaries, undermining nuclear-security efforts.

Security systems at nuclear facilities are designed and operated on the principles of detection, delay, and response.Footnote9 To be effective, security measures must ensure that an adversary action is quickly detected. The delay function is then designed to slow the progress of the adversary long enough for an appropriate response to be executed. At stake here are competing timelines—the adversary racing to defeat security measures and the guard force racing to neutralize the threat. The tension between them is at the center of all security systems.Footnote10 The following three cases show how human failings can undermine different components of a nuclear-security system, from the initial detection of an adversary to the measures intended to delay their actions to the ability of authorized responders to conduct a swift and effective response.
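The race between these two timelines can be made concrete with a toy calculation. The sketch below is illustrative only; it is not drawn from the article or from any official assessment methodology, and the function name, security layers, and all timing figures are hypothetical. The point it captures is that a response succeeds only if the delay accumulated *after* the point of detection exceeds the time the guard force needs to intervene; delay incurred before detection buys the responders nothing.

```python
# Illustrative sketch (hypothetical values): comparing the adversary's
# remaining task time against the guard force's response time.

def adversary_wins(delays, detection_index, response_time):
    """Return True if the adversary finishes before the response arrives.

    delays: minutes each security layer delays the adversary, in order
    detection_index: index of the layer at which detection first occurs
    response_time: minutes the guard force needs to interrupt the attack
    """
    # Only delay accrued at or after the point of detection counts,
    # because the clock for the responders starts at detection.
    time_remaining_for_adversary = sum(delays[detection_index:])
    return time_remaining_for_adversary < response_time

# Example: four layers (fence, building door, vault, material removal),
# with detection occurring only at the building door.
delays = [2, 5, 10, 8]
print(adversary_wins(delays, detection_index=1, response_time=30))  # True: 23 < 30
```

Under these made-up numbers, detecting the adversary one layer earlier (at the fence) would add 2 minutes of useful delay, illustrating why early detection is worth more to the defender than delay alone.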

Wilmington, United States, 1979

An early example of nuclear-security failure is the theft of low-enriched uranium (LEU) from a General Electric fuel-fabrication facility in Wilmington, North Carolina, in the United States.Footnote11 Over the course of a single hour, the perpetrator, a contracted laboratory technician named David Learned Dale, demonstrated how an insider can test security systems, identify specific weaknesses, and then exploit them sequentially for illicit purposes.

Dale’s theft took place on January 29, 1979, in the evening, when the plant was relatively deserted. He gained entry to the facility along with the night shift around 11 p.m., when security guards failed to notice that the identification card presented by Dale was not an authorized access card. Instead, it was his Florida driver’s license. Dale presented it in the knowledge that the license had the same blue background as the badge of a permanent employee, which, crucially, permitted off-hours access to the facility. It was not the first time he had used his driver's license to enter the facility, having successfully tested access-control procedures on several prior occasions.Footnote12 After driving into the plant, rather than proceeding to the employee parking lot, he headed to a sensitive area of the site to park adjacent to the building where he worked, which contained the LEU he intended to steal. Vehicle access to this part of the site would normally have been restricted, but a gate had been removed to allow the installation of truck scales.Footnote13

After entering the building, he accessed the uranium vault through a door that was normally locked but had been left open because of a malfunctioning locking mechanism. He transferred a quantity of LEU to his car using laboratory equipment before driving out of the facility the same way he had entered, just before midnight. The timing was central to Dale’s success; had he left after midnight, he would have been required to sign out, and his presence on site would have been recorded.

In stealing the LEU, Dale used his knowledge of the facility to identify a series of security weaknesses, which he exploited to avoid detection. These included a lack of vigilance by the guard at the facility entrance and the failure to establish temporary security measures following both the removal of the gate and the malfunctioning of the lock. Had mitigating measures been put in place, they might have delayed Dale sufficiently to make it impossible for him to accomplish the theft in the one-hour window he had available.

In the end, the consequences of Dale’s theft proved relatively minor. He was quickly identified as the perpetrator after a botched extortion attempt, and the stolen uranium was recovered a few days later.Footnote14 Yet this is not always the case, as illustrated by what is arguably the most significant nuclear-security incident on record: the bombing of Koeberg Nuclear Power Plant in apartheid-era South Africa.

Koeberg, South Africa, 1982

The Koeberg attack was carried out by Rodney Wilkinson, a white South African contractor who had worked at the plant for several years during the final stages of its construction. Wilkinson was supported by the guerrilla army of the African National Congress (ANC). Over several months, Wilkinson acquired ANC limpet mines, which he smuggled into the plant, concealing them in a hidden compartment in his car and then under his overalls as he passed through security. In preparing for the attack, Wilkinson took an approach similar to Dale’s, reportedly testing security systems at Koeberg by smuggling in whiskey and vodka bottles roughly the same size as the limpet mines.Footnote15 He was apprehended at least once with the illicit alcohol but escaped with a verbal warning about the possession of contraband. The mines exploded as planned on December 18, during the weekend, when the plant was relatively deserted. Although no one was injured, the attack resulted in significant financial costs, delaying the commissioning of the plant by 18 months.Footnote16

Certain senior officials at Koeberg were aware that the site was highly likely to be targeted by the ANC.Footnote17 According to Paul Senmark, a senior manager at Koeberg, “[W]e even pinpointed 16 December 1982, which was a public holiday, as the likely date [of an attack].”Footnote18 Despite these concerns, security at the plant remained limited, with requests for more stringent measures ignored in favor of broader commercial considerations. According to Peter Wakefield, then the assistant operating manager of the plant, while the South Africans were well aware of the ANC threat, they encountered resistance from the French construction firm, “who had warned that if security held them up [, they] would claim compensation.”Footnote19

These examples provide useful insights into the various ways in which the human dimension can frustrate nuclear-security systems and render them ineffective. It might be argued, however, that the historical nature of these cases undermines their relevance to the contemporary context: in the decades since the events described above, nuclear security has progressed considerably. Yet the next case demonstrates that, despite these advances, the human dimension remains an enduring challenge for those tasked with securing nuclear facilities.

Y-12, United States, 2012

One of the most widely reported failures of a security system at a nuclear facility occurred in the United States on the night of July 28, 2012.Footnote20 The incident saw three anti-nuclear-weapons protesters gain unauthorized access to the Y-12 National Security Complex in Oak Ridge, Tennessee. This is one of the most sensitive nuclear sites in the United States, as it houses the Highly Enriched Uranium Materials Facility (HEUMF), which serves as the “nation’s central repository for highly enriched uranium,” a key nuclear-weapons ingredient.Footnote21 The significance of the facility meant that tens of millions of dollars were spent each year on security, with high-tech systems deployed for detecting intrusions and an armed response force stationed on site.Footnote22

Despite the presence of what was perceived to be a robust physical-protection system, the protesters were able to gain access and remain undetected and unchallenged on site at Y-12 for several hours. During this time, they reached the protected area surrounding the HEUMF.Footnote23

The incident triggered an investigation by the US Department of Energy and a broader Senate inquiry, both of which revealed systemic failures at multiple levels, including significant security-maintenance backlogs, overuse of compensatory measures, unrealistic testing practices, and the absence of appropriate regulatory oversight.Footnote24 For example, a key camera system that could have clearly detected the presence of protesters had been out of service for months, while other surveillance-related equipment had been poorly maintained and inadequately tested.Footnote25

These failings aside, multiple alarms were still triggered as the protesters cut through fences and walked across sensor fields. Crucially, however, the alarms were initially ignored by the security force at Y-12.Footnote26 Once the protesters reached the HEUMF, they banged on the side of the building with hammers, setting off a number of internal alarms. Guards at the HEUMF manually silenced these alarms after a flawed visual assessment of the surrounding area, conducted with a camera system that was not authorized for that purpose and failed to identify the protesters.Footnote27 Instead, the noise was attributed to maintenance operations, despite the unconventional timing—the protesters struck in the early hours of the morning. These represent just some of the security weaknesses at Y-12 exposed by the incursion, with the Senate inquiry highlighting a broader underlying “security culture of complacency.”Footnote28

Ultimately, although it was a serious lapse in security, the incident did not result in any theft of or interference with HEU. With their “peaceful intrusion,” the protesters sought to promote “the cause of disarmament through symbolic action.”Footnote29 However, the actions of adversaries (broadly defined) are not always so benign.

The enduring importance of the human factor in nuclear-security systems

These three examples illustrate how weaknesses in human behavior—particularly when it comes to security planning, maintenance, operations, and testing—can render security systems ineffective. In the extreme case of Y-12, multiple failures rendered a technologically advanced and seemingly well-resourced physical-protection system vulnerable to three protesters with little knowledge of on-site security procedures. In the other two cases, insiders carefully planned their attacks, testing the security systems in place and identifying specific weaknesses that could be exploited. This type of behavior by insiders has proven to be relatively common. For example, a report by the US Nuclear Regulatory Commission (NRC) on insider threats revealed that in “nearly every” one of the 112 cases analyzed, insiders took advantage of security-system vulnerabilities.Footnote30 Consequently, even a small number of weaknesses may allow insiders to defeat a security system.

Digging deeper into the earlier two cases also serves to highlight the role that preventative security can play in mitigating threats. These types of measures were not common practice in the nuclear industry in the 1970s and 1980s, but it is useful to consider the likely implications if they had been. For example, if Wilkinson had been vetted before being hired, his superiors may well have discovered that he had previously deserted from the military and held strong anti-apartheid views.Footnote31 These factors alone would probably have disqualified him from a job at Koeberg. As for Dale, his brother testified at his trial that he was suffering from serious depression before the incident. This condition was attributed to impending unemployment, as his contracted position at Wilmington was due to end in the coming months.Footnote32 If a staff observation program had existed at Wilmington at the time, any changes in his behavior that were attributable to depression might have been noticed and reported by colleagues.

At the same time, the successful implementation of these and other security measures can be far from straightforward, a point that will be explored in more detail later in the article. This is partly because indicators of an insider threat may be subtle in nature and hard to identify, but also because elements of organizational culture may hamper their use. Sometimes even clear warning signs may not be noticed and reported through appropriate channels. This phenomenon is vividly illustrated in a study by Matthew Bunn and Scott Sagan, which examines a series of extreme insider incidents across a range of sectors. The study identifies several cases where “even the reddest of red flags” were ignored, despite preventative security measures in place.Footnote33

Nuclear security culture: evolution and concept

The relationship between the human factor and nuclear operations has been the subject of extensive study and debate for decades. Research in this area first began to take shape in the 1970s, when nuclear-power-plant analysts began to explore “effects of human performance in system reliability or safety studies.”Footnote34 This early work sought to adapt and build on the considerable body of knowledge and understanding that already existed in the field of human-factors engineering, whose focus was “the analysis and synthesis of systems in which both human and machine interact closely.”Footnote35 This approach held clear value since a key empirical finding in human-factors engineering was that “as large-scale human-machine systems become more complex, and as automation plays a greater role, accidents are increasingly attributed to human error.”Footnote36

The importance of this work and its relevance to nuclear safety had gained recognition by the mid-1970s. For example, a special inquiry into the accident at the Three Mile Island nuclear plant by the NRC revealed that “one of the NRC staff’s leading safety experts, Stephen Hanauer, had claimed in 1975 that ‘present designs do not make adequate provision for limitations of people.’”Footnote37 But only with the high-profile accidents at Three Mile Island in the United States (1979) and the Chernobyl plant in the Soviet Union (1986) did the human factor become a priority for the international community and the nuclear industry. The commission established by President Jimmy Carter in the wake of the Three Mile Island incident found that “[t]he accident was initiated by mechanical malfunctions in the plant and made much worse by a combination of human errors in responding to it.”Footnote38 More broadly, the president’s commission found that both the nuclear industry and the NRC, its regulator, had “failed to recognize sufficiently … that the human beings who manage and operate the plants constitute an important safety system.”Footnote39

Similar concerns with human factors emerged from an inquiry into the Chernobyl disaster seven years later. In its review, published only months after the events of April 1986, the International Nuclear Safety Advisory Group (INSAG), operating with a mandate from the IAEA director general, found that “[t]he root cause of the Chernobyl accident, it is concluded, is to be found in the so-called human element.”Footnote40 Specifically, “the accident was caused by a remarkable range of human errors and violations of operating rules in combination with specific reactor features which compounded and amplified the effects of the errors and led to the reactivity excursion.”Footnote41

It was at this point that the human factor began to be framed in broader terms, with INSAG identifying the “need for a ‘nuclear safety culture’ in all operating nuclear power plants.”Footnote42 This “preliminary concept was further developed by the IAEA in support of nuclear power-plant safety and evolved into a stand-alone initiative that has a direct application for a wide range of nuclear programs.”Footnote43 The IAEA’s adoption of safety culture as a key concept was representative of broader change in the usage of the term “culture.” William Sewell describes how, from the 1980s onward, the “intellectual ecology of the study of culture,” traditionally associated with anthropology, was “transformed by a vast expansion of work on culture.”Footnote44 Interest in the concept “swept over a wide range of academic disciplines and specialties.”Footnote45 One of these was management studies, where, “[i]n the early 1980s, ‘culture’ became a buzzword.”Footnote46 Ultimately, developments in this area inspired the approach to safety culture that the IAEA adopted. Specifically, it was the model of organizational culture and leadership developed by Edgar Schein that came to form the basis of the IAEA-endorsed approach to nuclear-safety culture.Footnote47

While the concept of culture became progressively more embedded in discourse on nuclear safety from the 1980s onward, only at the end of the 1990s did the idea of nuclear-security culture start to emerge in discussions of physical protection. In November 1999, IAEA Director General Mohamed ElBaradei convened an “Informal Expert Meeting to Discuss Whether there is a Need to Revise the Convention on the Physical Protection of Nuclear Material (CPPNM).”Footnote48 One of the outcomes of this meeting was a document containing “Physical Protection Objectives and Fundamental Principles.”Footnote49 Crucially, it established security culture as a fundamental principle, stating, “All organizations involved in implementing physical protection should give due priority to the security culture, to its development and maintenance necessary to ensure its effective implementation in the entire organization.”Footnote50

The terrorist attacks of September 2001 gave new momentum to international nuclear-security efforts, and, around this time, the concept of security culture began to feature more prominently in IAEA guidance documents and debates.Footnote51 In 2004, for example, the IAEA Code of Conduct on the Safety and Security of Radioactive Sources highlighted, as a basic principle, the need for the “promotion of safety culture and of security culture.”Footnote52 The following year, at an IAEA conference on nuclear security, held in London, participants from various member states highlighted the growing importance attached to nuclear-security culture while hinting at the challenges facing efforts to develop this concept, from ensuring common understanding of the term and its significance to addressing practical questions regarding implementation.Footnote53 One IAEA official noted,

As regards nuclear security culture, after the Chernobyl accident it was not very difficult to promote nuclear safety culture and translate it into safety management methodologies, but a lengthy ‘fermentation process’ was necessary before it became established. In my view, the establishment of security culture will require an even longer fermentation process.Footnote54

Yet awareness of the importance of security culture was already beginning to take more concrete form. In the 2005 amendment to the Convention on the Physical Protection of Nuclear Material (CPPNM), adopted by consensus, security culture was added as a “fundamental principle.” The amendment states, “All organizations involved in implementing physical protection should give due priority to the security culture, to its development and maintenance necessary to ensure its effective implementation in the entire organization.”Footnote55 This was followed, in 2008, by the publication of a dedicated IAEA implementing guide on nuclear-security culture.Footnote56 This further highlighted the importance of the human dimension in nuclear-security systems, noting,

A human factor is generally a contributor to all nuclear security related incidents … They include deliberate malicious acts, unintentional personnel errors as well as ergonomic issues related to the design and layout of software and hardware, inadequate organizational procedures and processes and management failures.Footnote57

This document crystallized broader international trends in nuclear security and laid the groundwork for subsequent work in this area. Indeed, it was an indicator of the significance now attached to culture in international thinking on nuclear security that a number of states soon began to explore how to measure and assess the concept. In 2012, for example, the National Nuclear Energy Agency of Indonesia (BATAN) launched a nuclear-security-culture self-assessment trial at its research reactors, supported by the IAEA.Footnote58 In 2014, a similar initiative was launched in Bulgaria.Footnote59 These early moves to assess nuclear-security culture were milestones; they fed into a broader IAEA effort to codify a common approach, which took the form of new guidance published in 2017.Footnote60 This document sought to provide those wishing to conduct a self-assessment with the rationale and tools required to do so. A security-culture assessment is

a multi-stage process, with a heavy focus on perceptions, views, and behavior, whose goal is to help understand and explain the reasons for an organization’s patterns of behavior, devise optimal security arrangements, and predict how the workforce may react in certain circumstances.Footnote61

The work on self-assessment took place against the backdrop of the aforementioned NSS process, which gave the concept of security culture progressively more attention. Since the summits, many other countries have initiated new programs to assess and enhance nuclear-security culture. Organizations such as the World Institute for Nuclear Security (WINS) have sought to build on IAEA efforts by developing additional nuclear-security-culture guidance and tools.Footnote62

The IAEA model of nuclear-security culture

The preceding section charted how and why the concept of culture emerged and gained momentum in the nuclear-security context. This decades-long process culminated in an IAEA-endorsed model for nuclear-security culture and associated guidance. As was the case for nuclear safety, Edgar Schein’s formulation of organizational culture and leadership was at the center of the IAEA approach.Footnote63 The primary value of this model is arguably its simplicity and accessibility, which has enabled researchers and practitioners from a wide range of sectors to conduct socio-technical analyses of different organizational phenomena.Footnote64

Drawing on the field of anthropology, Schein defines culture as

a pattern of shared basic assumptions that was learned by a group as it solved its problems of external adaptation and internal integration, that has worked well enough to be considered valid and, therefore, to be taught to new members as the correct way to perceive, think, and feel in relation to those problems.Footnote65

Here culture is both “a stable property of a group” and a “perpetually emerging set of understandings.”Footnote66 In other words, the culture of an organization can evolve, but as it is often deeply embedded, it may take time and effort to change. To understand an organization’s culture, Schein proposes a simple three-layered model consisting of basic underlying assumptions, espoused beliefs and values, and artifacts.Footnote67 Artifacts are the visible, surface-level manifestations of an organization’s culture, including behaviors, documented procedures, management structures, and the actions of leaders. Underlying and driving these artifacts are espoused or expressed beliefs and values, such as teamwork or quality, which are communicated across a workforce, resulting in shared understandings. Beneath these espoused beliefs and values are shared, tacit assumptions, which may not be readily articulated, but can strongly influence behavior. For example, an organization may work on the assumption that operations should be safe or that a business should generate a profit.

In the IAEA’s model, these layers are used as an organizing framework, which outlines a series of key characteristics for each element deemed important in cultivating a culture that “leads to more effective nuclear security.”Footnote68 The model provides performance indicators for most characteristics, which, in theory at least, offer a means of measuring or assessing the different aspects of an organization’s culture that influence nuclear security. The model places considerable focus on leadership behaviors and management systems, a choice consistent with Schein’s approach, in which the “concept of leadership is intertwined with culture.”Footnote69 Here, IAEA guidance is explicit in articulating what constitutes effective nuclear-security leadership and management, outlining 25 characteristics and more than 300 associated indicators.Footnote70

Here it is important to acknowledge that distinct differences in national and operating environments for nuclear security mean that these characteristics and indicators may not necessarily apply, or apply equally, to all organizations. The IAEA acknowledges this point, emphasizing that the goal of its guidelines is to “stimulate further thought rather than be prescriptive.”Footnote71 Furthermore, the majority of the guidance’s characteristics and indicators are sufficiently generic in nature that they could apply to many functional areas, not just security. To be meaningfully applied, they must be tailored to specific organizations and their needs.

Nuclear-security culture in practice

How useful is the IAEA’s nuclear-security-culture model and associated guidance in understanding how an organization implements nuclear security? Do they provide a robust framework for identifying and rectifying potential weaknesses in human performance? And how have they been applied within the global nuclear industry?

At the operational level, nuclear-security culture may vary considerably, reflecting a multitude of factors that include differences in national, regulatory, and organizational approaches. Our study here is not exhaustive but aims to provide some deeper insight into this relatively unexplored aspect of nuclear security, highlighting potential challenges that may be encountered and probing the utility of IAEA guidance. Specifically, we aim to further understand how threat perceptions may develop in different occupational groups, why certain management and leadership practices can unintentionally trigger security-related issues, and the utility of different approaches to assessing nuclear-security culture. Our analysis draws on a review of relevant international initiatives and over 40 interviews conducted by the authors with practitioners from UK industry.Footnote72

Nuclear-security beliefs: threat perceptions versus reality

In the broader security context, studies have shown that if employees do not believe that a threat is real and significant, they are far less likely to internalize associated training, adhere to relevant processes, and give due consideration to the security-relevant aspects of their work.Footnote73 This finding is reflected in the IAEA model and guidance, which treat as the foundation of nuclear-security culture a recognition by all those working in a nuclear organization “that a credible threat exists and that nuclear security is important.”Footnote74 However, our interviews found that for this awareness to translate into effective security-related behaviors, it must extend to the full spectrum of threats faced by nuclear organizations, with consideration of how these threats might manifest themselves in different working environments.

Threat assessment for nuclear-security planning is typically conducted at the national level and encapsulated in a document commonly known as a design basis threat (DBT), which contains estimates of adversaries’ intentions and capabilities.Footnote75 However, the information within a DBT and the sources upon which it draws are sensitive, and its distribution is typically restricted to senior nuclear-security personnel. Although unavoidable, this practice presents a challenge when it comes to communicating a realistic understanding of threats across an entire nuclear workforce. This challenge is further compounded by the relative rarity of serious security incidents at nuclear facilities compared to experiences in many other sectors.Footnote76

Given these challenges, it is perhaps not surprising that our research in the United Kingdom revealed significant variation in threat perceptions, particularly when considered across different occupational groups. As expected, security personnel involved in guarding facilities and responding to threats had the most nuanced and comprehensive understanding of the spectrum of threats facing their organization and the potential consequences of a successful attack. Their colleagues in roles that are not security-focused but contain a dedicated security component also demonstrated detailed task-specific knowledge. For example, individuals working in human resources were acutely aware of the threat posed by insiders, showed familiarity with recent examples of relevant incidents, and understood the role that vetting and other administrative processes could play in reducing this risk.

In contrast, employees in technical, commercial, or management roles tended to have a more abstract understanding of nuclear-security threats. Many of these interviewees’ threat perceptions were shaped by contemporary events and media reporting: they frequently referenced the terrorist attacks in Manchester and London that occurred in 2017, less than six months prior to the interviews. This finding is consistent with previous studies showing that secondhand exposure via various media can strongly influence how individuals think and behave in relation to threats, in or outside of work.Footnote77 In our study, these widely reported major national-security events colored the lenses through which individuals viewed the threat, and this influence was reflected in our interviews, where discussions focused largely on scenarios involving heavily armed external adversaries motivated by ideology.

Most interview subjects in the latter occupational groups were skeptical of the idea of their own colleagues posing a threat. Many referenced the initial vetting that employees are required to complete before entering the UK nuclear industry and made clear their view that this process adequately addressed the potential for insider threats. This perspective also appeared to be influenced by length of service—several participants in our study had worked at the same facility in similar roles over their whole career. These interviewees made positive reference to a workplace with a “family-oriented culture” where “everyone knows everyone”—an intrinsically trusting environment, where it was difficult to envision an insider threat.Footnote78 This view is potentially problematic. Studies of threats across different industries show that future insiders may successfully complete initial vetting and only initiate malicious acts many years after commencing employment.Footnote79

Developing an equally realistic and detailed understanding of the threat across the different occupational groups in a nuclear organization’s workforce is clearly a challenging task. Misperceptions can occur at all levels, including at the very top, as highlighted in the aforementioned case of the ANC attack on the Koeberg Nuclear Power Plant. They may even occur among those directly responsible for security. A study by researchers from Harvard University, which surveyed nuclear-security managers, found that the majority viewed the threat posed by a group of insiders working together as either “not credible or modestly credible.”Footnote80 This view does not align with the evidence from past incidents, in nuclear and other industries, which shows multiple cooperative insiders to be a relatively common occurrence.Footnote81

Several approaches could be adopted to bridge these gaps. At the international and national levels, there is scope for increased information sharing on real-life nuclear-security incidents.Footnote82 In the United Kingdom, the NDA has established a Nuclear Security Forum to this end, as part of which nuclear-security managers regularly meet to discuss incidents and near misses. For broader awareness-raising activities and security training, there is clearly scope for the use of both nonfictional and realistic fictional scenarios, tailored to different audiences and delivered in a way that stimulates active engagement. In our study, several participants noted that security-related training can be jargon-heavy and difficult to relate to their specific roles. A good example of efforts to address this issue is the theater-based training workshops developed by WINS, the NDA, and others, which seek to provide a more accessible and interactive means of exploring nuclear-security incidents.Footnote83

Some determinants of nuclear-security-related behavior

A belief in and understanding of the threat are not the only drivers of employee behavior. Our research also highlights the important influence that broader organizational factors can have and the utility of IAEA’s model of nuclear-security culture in interpreting what, at first sight, might seem to be surprising behaviors.

Close adherence to procedure is clearly fundamental to the effective implementation of nuclear-security programs. The serious consequences that can follow from a lack of adherence were vividly highlighted in the real-life case studies discussed earlier. Yet studies have shown that it is common for organizations to struggle with noncompliance in different aspects of their business, security included.Footnote84 Consequently, many of the international efforts to assess nuclear-security culture have sought to understand what drives adherence. This was also a significant focus of our study.

Clearly, many factors can reduce adherence to security procedures. Interviewees cited incomplete understanding of these procedures and inconsistency in how they were enforced, including the nature of penalties for noncompliance. Arguably the most pressing issue raised was the perception that security was at times an unnecessary hindrance that significantly undermined productivity. These attitudes and behaviors may be partly explained by the aforementioned threat perceptions exhibited by some interviewees. Yet there were also other significant influences, particularly the processes for developing security procedures and communicating them to staff.

The majority of interviewees expressed some frustration with the process for design and deployment of security measures and policies. The most common complaint was that managers failed to recognize how security measures impinged on daily operations. While the majority of these measures and policies were ultimately adapted to resolve implementation issues, they nevertheless created periods of tension and uncertainty until more appropriate procedures were worked out. Until this point, employees adopted what have been described as “shadow security” practices—workarounds that represent a compromise between “getting the job done and managing … risks.”Footnote85 These problems were accentuated by what many interviewees perceived as a “broad-brush” approach to security in their organizations that did not sufficiently account for differences in the sensitivity of assets that needed protecting. In their opinion, security measures had gone too far in several areas, stifling productivity.Footnote86

These attitudes were reinforced by shortcomings that employees saw in the communications process. Many employees viewed it as a top-down, one-way flow of information, underpinned by what one person described as an attitude of “you do what you are told and you don’t question it.”Footnote87 Another interviewee spoke of a “military mindset” adopted by certain senior personnel.Footnote88 The general sentiment was that this approach was counterproductive in a corporate environment.Footnote89 Several interviewees felt it was important to strike a balance between protecting potentially sensitive information and sharing information on nuclear-security decisions to enhance buy-in and understanding among staff. In the nuclear sector, the balance is arguably still “weighted very much towards the restriction of information,” not least because of a pre-existing culture of secrecy around nuclear security that reflects the strategic dual-use nature of nuclear technology.Footnote90 Moving more “towards the sharing end of the spectrum” could have benefits for nuclear-security practice.Footnote91

Many of the issues outlined here can be addressed through the approaches outlined in IAEA guidance on nuclear-security culture. For example, IAEA guidance speaks of “involv[ing] staff members in the risk assessment and decision making processes” for security decisions, as well as “explain[ing] the context for issues and decisions,” “welcom[ing] staff input,” and “allowing effective two way communication.”Footnote92 The organizations involved in our study subsequently launched a process of staff consultation, with a wide range of employees feeding into the revision of security policies and processes. These organizations adopted strategies to continuously gather staff input and stimulate discussion with regard to security. These strategies include extending so-called “safety shares”—a time set aside to discuss safety-related issues at the start of important meetings—to include security issues, as well as the use of systems for collecting employee feedback.Footnote93

Another key security-related behavior, widely seen as essential to the effective implementation of nuclear security, is that of vigilance. A lack of vigilance can dramatically diminish the effectiveness of security measures, as illustrated in the real-life cases discussed earlier, where clear opportunities to counter unauthorized acts were missed. Facility staff are increasingly expected to identify and report a spectrum of behaviors, from colleagues’ avoidance of security processes to actions suggesting more malicious intent. However, developing a security-vigilant workforce is far from straightforward. Personnel must be trained to notice and question unusual occurrences, assess their security significance, proactively challenge their colleagues in an appropriate and respectful manner, and, if necessary, report potential infractions. All of this activity has the potential to create tensions in a workplace, which, if not appropriately managed, can reduce the effectiveness of programs designed to encourage vigilance. The challenge of promoting an open working environment where staff are able to raise concerns without fearing retribution was highlighted in a 2016 report by the US Department of Energy, which emphasized the need to strengthen protection for whistleblowers within the nuclear sector.Footnote94

On this point, our research revealed that the idea of individuals raising potential security concerns was widely perceived by colleagues and management as important and beneficial. Most interviewees referenced multiple occasions where they had either challenged someone or been challenged themselves over a potential security issue. Relatively junior staff also felt comfortable challenging senior managers on adherence to security-related protocols when necessary. This approach had been promoted through the NDA’s “See Something, Say Something” event-reporting process, introduced in 2015, which initially focused on safety but then expanded to include security.Footnote95 The influence of this approach was clear, with interviewees describing well-established reporting systems within their organizations that were familiar to and used by employees. The majority of interviewees were also aware of other options open to them, including the ability to report directly to the UK Office for Nuclear Regulation.

Despite these positives, it is nevertheless somewhat unclear how effective employees would be at identifying malicious insiders, since most research suggests that the approach of these adversaries will be carefully planned and will rely heavily on concealment.Footnote96 Moreover, consistent with the threat perceptions discussed earlier, most interviewees focused on the importance of identifying and challenging potential external adversaries, such as individuals they did not recognize and who might not have received authorization to be in their working area. At the same time, there is potential to build on existing systems for raising concerns with additional training and awareness raising focused on insider threats and their potential indicators.

How to assess nuclear-security culture?

In the aftermath of a serious incident, it should be possible to retrospectively identify weaknesses in security implementation. The discovery of potentially serious issues in advance is far more challenging. Performance-testing exercises are commonly used within the industry to probe the effectiveness of nuclear-security systems. While these methods may identify certain deficiencies, they can be difficult to implement and may not provide detailed information on underlying causes. The limitations of these approaches were highlighted in the aforementioned Y-12 incursion, where testing and assurance mechanisms used in advance of the incident led the US National Nuclear Security Administration to falsely conclude that the security systems at the facility were “one of the most innovative and higher performing” within their complex.Footnote97

IAEA guidance advocates the use of a “wide range of tools, including survey, interview, document review and observation” to gather data on the “30 characteristics of nuclear security culture included in the IAEA model.”Footnote98 It also recommends that assessments should occur regularly, be tailored to specific facilities, involve security and non-security personnel, take place iteratively, and draw on the expertise of outside specialists in their design and delivery.Footnote99 This is a comprehensive approach for exploring organizational culture, but it has yet to be fully adopted for security within the nuclear industry. For example, of the tens of nuclear-security-culture assessments reported to date, many have used just one or two assessment methodologies, with most drawing conclusions based largely on survey results.Footnote100

Yet the analyses in this article and elsewhere have shown that, despite its widespread use, a purely survey-based approach to security-culture assessment, no matter how well constructed, will produce only limited insights.Footnote101 In the United Kingdom, a suite of Security Culture Assessment Tools (SeCuRE), developed by the Centre for the Protection of National Infrastructure and comprising multiple surveys, is used across a range of industries “to measure aspects of their security culture.”Footnote102 This tool set has been refined over many years and has been used in the nuclear sector for more than a decade. In the case of SeCuRE, several nuclear-security managers interviewed noted that although their organizations tended to score highly, they did not believe that these results fully reflected reality. Instead, they attributed the high scores to SeCuRE’s development for use across the United Kingdom’s 13 critical national infrastructure (CNI) sectors. Given the strong emphasis placed on security within the nuclear industry compared to some other CNI sectors, interviewees pointed out that nuclear organizations will inevitably score highly.

Surveys are a fundamental component of any organizational-culture assessment. However, while they may be able to identify potential issues in nuclear security, they do not necessarily shed light on underlying causes. Using complementary methods, such as interviews, can provide a deeper understanding of the why as well as the what. For example, our interview-based study revealed a number of underlying and interrelated reasons for problems encountered by employees related to the classification and control of information. Challenges included overly bureaucratic systems for obtaining sensitive information, inadequate training, and inconsistent labeling practices. Interviews both helped to identify these issues and allowed their significance to be understood. For example, interviews revealed how an individual’s lack of understanding of how to segregate sensitive from nonsensitive information could result in the overclassification of documents in an attempt to be “safe rather than sorry.”Footnote103 Despite good intentions, these types of action led to situations where nonsensitive documents could not be shared with colleagues with lower levels of clearance. These more detailed insights would have been difficult to uncover through a survey alone.

Recipients may also perceive the completion of a survey as a time-consuming and thankless task. In commercial environments where a premium is placed on productivity, surveys—particularly if they are frequently distributed for many purposes within an organization—can become a box-checking exercise, serving to reinforce the expectations of management. This was the view of a number of interviewees in our study, one of whom noted that “with the surveys, a lot of it is you telling them what they want to hear.”Footnote104 In contrast, the conversational aspect of interviews, combined with the rapport that interviewers seek to build with subjects, can uncover insights beyond formulaic or contrived responses. To achieve this rapport, however, interviews must be conducted by specialists with experience in teasing out what a subject truly thinks. This has proved challenging for nuclear-security-culture self-assessments. For example, an internal review of a nuclear-security-culture assessment carried out by Indonesia’s BATAN in 2012 noted that the insights obtained from interviews were diminished by a “lack of interview skills.”Footnote105 Adhering only to predetermined questions rendered the interviews, in effect, oral surveys, with BATAN noting that “in retrospect … the choice of a structured interview was too limiting.”Footnote106

On a more positive note, most assessments have used the IAEA’s model of nuclear-security culture in a flexible rather than prescriptive manner. This approach recognizes that it is not possible to incorporate all organizational characteristics and their associated indicators into an assessment of nuclear-security culture. Instead, IAEA guidance has been adapted to suit the needs and areas of concern of different organizations. For example, in the aforementioned 2012 BATAN assessment, the assessment team decided to concentrate on one key behavioral aspect, “adherence to procedures,” in order to “avoid asking more than 300 indicator questions.”Footnote107 In a subsequent assessment by BATAN in 2015, “twelve out of thirty characteristics from the IAEA list of elements of security culture were selected based on factors which were considered most important for the implementation of nuclear security.”Footnote108 Similarly, in a 2016 security-culture assessment at a number of hospitals in Malaysia containing radioactive materials, 50 of the IAEA’s 300 indicators were selected, and questions were rephrased “in ways that would be easily understood.”Footnote109

Key findings and areas for further exploration

The above discussion reflects the focus of our study, which examined security culture at a handful of nuclear sites within the United Kingdom. Other facilities may face different challenges. Nevertheless, a number of the study’s lessons are broadly applicable to nuclear organizations developing their own nuclear-security-culture programs.

First, it is important to consider whether and to what extent employees’ threat perceptions diverge from the best available assessments, and how these perceptions may vary across different occupational groups. If this divergence is not identified and addressed, it can undermine broader efforts to improve nuclear-security practice, particularly preventative security measures.

Second, management systems and leadership behaviors play a key role in shaping employee engagement with security. The nature of this influence is complex and multifaceted and will depend on the broader organizational context within which actions are taken. IAEA guidance can be useful for understanding security-related behaviors, while also providing suggestions for improvement of leadership and management practices. Third, given the secrecy still associated with security within the global nuclear industry, many organizations are likely to benefit from greater transparency. Increasing discussions of decisions about nuclear security with non-security personnel may improve engagement and buy-in, as well as support the development of effective solutions that balance security, operational, commercial, and other considerations.

Finally, there is considerable scope for and value in moving beyond the usual approach of largely ad hoc, survey-based nuclear-security-culture assessments. A more rigorous approach is required that engages with the full suite of methodologies recommended in IAEA guidance. This will allow for a deeper understanding of nuclear-security culture within an organization and how it can be strengthened.

Despite our conducting over 40 in-depth interviews, many facets of nuclear-security culture could not be explored in depth. One example is the influence of what Schein calls occupational cultures, which are rooted in groups’ different professional backgrounds. IAEA guidance refers to security and non-security subcultures. However, our study suggests that it may be necessary to account for a wider range of groups. An occupational group of particular interest was contractors—temporary staff brought in to complete a specific task, typically employed through a third party. Our research hinted at tensions between permanent staff and contractors at certain sites, with permanent staff viewing contractors as overly focused on wanting to get the job done “as quickly as possible.”Footnote110 Conversely, one contractor interviewed claimed that many in this occupational group view permanent employees as obstructive: “[permanent staff] love to make it difficult for us to do our job.”Footnote111 It was clear that dialogue between these two groups could be improved, and further research on the dynamics of contractor engagement with a facility’s broader security culture would undoubtedly offer valuable lessons.

Conclusion

Recognition of the human dimension has gained momentum more slowly in nuclear security than in nuclear safety. One likely reason is the absence of major forcing events in the security realm comparable to Chernobyl and Three Mile Island, which drove significant changes in the realm of safety. Yet the human dimension is no less important to security, with a number of serious incidents demonstrating how adversaries take advantage of weaknesses in internal security implementation. For both nuclear safety and security, the human dimension is now widely viewed through the prism of culture, anchored in conceptual frameworks drawn from management studies. The IAEA model and associated guidance provide a useful and relatively comprehensive tool kit for identifying and understanding the factors influencing employee behaviors toward security within nuclear organizations. This approach must be tailored to different national and operational environments, but its overall value is clear.

International efforts to promote the importance of nuclear-security culture at the operational level are now gaining momentum, with an upsurge over the past decade in initiatives aimed at understanding and strengthening the human factor and placing greater emphasis on the active involvement of all personnel in security. Yet there is more work needed to elevate security culture to the same level as safety culture within the nuclear industry. Future efforts in this area should seek to further promote information sharing and dialogue about both the nature of security threats and the thinking behind nuclear-security decision making.

Nuclear organizations should also draw on their experience in establishing nuclear-safety-culture programs. Methods for assessing and promoting safety culture are likely to be effective when applied to security. In our UK-focused study, several safety-culture systems had been successfully extended to incorporate security. There is also scope for new studies focused on extracting and sharing the practical lessons from establishing programs to strengthen nuclear-security culture. These studies could support other nuclear organizations looking to initiate new programs in this area. To date, studies have tended to focus on cases of “worst practices,” which help to raise awareness of the importance of security culture, but provide only limited insights into how to develop successful programs.

Acknowledgments

The authors are grateful to Paul Tonks, head of security and resilience at the UK Nuclear Decommissioning Authority, for sharing his expertise and insights and for his help in facilitating the UK-based interviews, as well as to the reviewers and editor for their helpful comments and suggestions.

Notes

1 William Tobey, “Planning for Success at the 2014 Nuclear Security Summit,” Stanley Foundation, December 2013, p. 5, <https://stanleycenter.org/wp-content/uploads/2019/09/TobeyPAB1213a.pdf>.

2 Matthew Bunn, “Reducing the greatest risks of nuclear theft and terrorism,” Daedalus, Vol. 138, No. 4 (Fall 2009), p. 116.

3 Centre for the Protection of National Infrastructure, “Personnel and People Security,” last updated August 9, 2021, <https://www.cpni.gov.uk/personnel-and-people-security>; International Atomic Energy Agency (IAEA), Preventative and Protective Measures Against Insider Threats: Implementing Guide, IAEA Nuclear Security Series No. 8-G (Rev. 1) (Vienna: International Atomic Energy Agency, 2008), <https://www-pub.iaea.org/MTCD/Publications/PDF/PUB1858_web.pdf>.

4 (UK) Office for Nuclear Regulation, “Security Assessment Principles for the Civil Nuclear Industry,” 2017, pp. 53–54, <http://www.onr.org.uk/syaps/security-assessment-principles-2017.pdf>.

5 White House, Office of the Press Secretary, “Key Facts About the Nuclear Security Summits,” April 13, 2010, <https://obamawhitehouse.archives.gov/the-press-office/key-facts-about-nuclear-security-summit>.

6 State commitments included establishing new “Centers of Excellence” for nuclear-security education and training, inviting international reviews of their nuclear-security systems, and launching security-culture self-assessments.

7 This study was undertaken by the authors from July 2017 to April 2018 in support of a broader exercise to assess security culture within nuclear organizations managed by the NDA.

8 National Consortium for the Study of Terrorism and Responses to Terrorism, “Nuclear Facilities Attack Database,” n.d., <https://www.start.umd.edu/nuclear-facilities-attack-database-nufad>.

9 Mary Lynn Garcia, “Introduction to Vulnerability Assessment,” in Lawrence J. Fennelly, ed., Effective Physical Security, 4th ed. (Waltham, MA: Butterworth-Heinemann, 2013), p. 21.

10 IAEA, “Nuclear Security Recommendations on the Physical Protection of Nuclear Material and Facilities,” INFCIRC/225/Revision 5, January 2011, pp. 58–59, <https://www-pub.iaea.org/MTCD/Publications/PDF/Pub1481_web.pdf>.

11 For a detailed overview of this incident, see Jeffrey T. Richelson, Defusing Armageddon: Inside NEST, America’s Secret Nuclear Bomb Squad (New York: Norton Press, 2009), pp. 37–42.

12 US Nuclear Regulatory Commission (NRC), “Attempted Extortion: Low Enriched Uranium,” IE Circular No. 79-08, May 18, 1979 (page last reviewed/updated March 25, 2021), p. 2, <https://www.nrc.gov/reading-rm/doc-collections/gen-comm/circulars/1979/cr79008.html>.

13 US NRC, “Attempted Extortion,” p. 2.

14 Richelson, Defusing Armageddon, pp. 41–42.

15 David Beresford, Truth is a Strange Fruit: A Personal Journey Through the Apartheid War (Johannesburg: Jacana Media, 2010), p. 105.

16 Jo-Ansie van Wyk, “Nuclear terrorism in Africa: The ANC’s Operation Mac and the attack on the Koeberg Nuclear Power Station in South Africa,” Historia, Vol. 60, No. 2 (November 2015), p. 53, <http://dx.doi.org/10.17159/2309-8392/2015/V60N2A3>.

17 This had become clear following the earlier theft of site plans for the plant by Renfrew Christie, who had been jailed in 1979 for handing these over to the ANC. Stuart Murray, Koeberg: Eskom’s Nuclear Success Story (Rondebosch, South Africa: Churchill Murray Publications, 1994).

18 Murray, Koeberg.

19 Murray, p. 73.

20 Eric Schlosser, “Break-in at Y-12,” New Yorker, March 2, 2015, <https://www.newyorker.com/magazine/2015/03/09/break-in-at-y-12>.

21 Y-12 National Security Complex, “Highly Enriched Uranium Materials Facility,” US Department of Energy (DoE), n.d., <https://www.y12.doe.gov/about/transforming-y-12/highly-enriched-uranium-materials-facility>.

22 US House, Committee on Armed Services, “Nuclear Security: Actions, Accountability and Reform,” hearing before the Subcommittee on Strategic Forces of the Committee on Armed Services, 113th Congress, 1st Session, H.A.S.C. No. 113-13, February 28, 2013, p. 5, <https://www.govinfo.gov/content/pkg/CHRG-113hhrg79996/pdf/CHRG-113hhrg79996.pdf>.

23 US House, Committee on Armed Services, “Nuclear Security,” p. 2.

24 US House, Committee on Armed Services, “Nuclear Security,” pp. 2–3; US DoE, Office of Inspector General, “Special Report: Inquiry into the Security Breach at the National Nuclear Security Administration's Y-12 National Security Complex,” DOE/IG-0868, August 2012, p. 7, <https://www.energy.gov/sites/prod/files/IG-0868_0.pdf>; US House, Committee on Armed Services, “Y-12 Intrusion: Investigation, Response, and Accountability,” hearing before the Subcommittee on Strategic Forces of the Committee on Armed Services, 112th Congress, 2nd Session, H.A.S.C. No. 112–156, September 13, 2013, p. 26.

25 US House, Committee on Armed Services, “Y-12 Intrusion,” p. 3.

26 DoE, Office of Inspector General, “Special Report,” p. 2.

27 DoE, Office of Inspector General, “Special Report,” p. 4.

28 US House, Committee on Armed Services, “Y-12 Intrusion,” p. 26.

29 Dan Zak, “The Prophets of Oak Ridge,” Washington Post, April 30, 2013, <https://www.washingtonpost.com/sf/wp-style/2013/09/13/the-prophets-of-oak-ridge>.

30 Sarah Mullen, “Generic Adversary Characteristics and the Potential Insider Threat to Licensed Nuclear Activities from Insiders,” US NRC, AP-POO3 374, 1981, p. 103, <https://www.hsdl.org/?view&did=11606>.

31 [Johannesburg] Sunday Times, “The Swordsman and the Bomb,” August 29, 2010, <https://www.pressreader.com/south-africa/sunday-times-1107/20100829/281719790897078>.

32 Wilmington Morning Star, “Dale gets 15 Years for Uranium Plot,” May 9, 1979, p. 2.

33 Matthew Bunn and Scott D. Sagan, “A Worst Practices Guide to Insider Threats,” in Matthew Bunn and Scott Sagan, eds., Insider Threats (Ithaca, NY: Cornell University Press, 2016), p. 153.

34 A. D. Swain and H. E. Guttmann, “Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications: Final Report,” Sandia National Laboratories (for the US NRC), NUREG/CR-1278, August 1983, p. 3, <https://www.nrc.gov/docs/ML0712/ML071210299.pdf>.

35 Thomas B. Sheridan, “Forty-five Years of Man–Machine Systems: History and Trends,” IFAC [International Federation of Automatic Control], Proceedings Volumes, Vol. 18, No. 10 (September 1985), p. 1, <doi.org/10.1016/S1474-6670(17)60193-9>.

36 Thomas B. Sheridan, “Risk, Human Error, and System Resilience: Fundamental Ideas,” Human Factors: The Journal of the Human Factors and Ergonomics Society, Vol. 50, No. 3 (June 2008), p. 418, <doi.org/10.1518/001872008X250773>.

37 Mitchell Rogovin and George T. Frampton, “Three Mile Island: A Report to the Commissioners and to the Public,” Vol. 1, Special Inquiry Group, US NRC, 1980, p. 122, <https://www.osti.gov/servlets/purl/5395798>.

38 President’s Commission on the Accident at TMI [Three Mile Island] (John G. Kemeny, chairman), Report of the President’s Commission on the Accident at Three Mile Island: The Need for Change: The Legacy of TMI, October 1979, p. 2 (emphasis added), <http://large.stanford.edu/courses/2012/ph241/tran1/docs/188.pdf>.

39 President’s Commission on the Accident at TMI, “Report of the President’s Commission,” p. 9.

40 International Nuclear Safety Advisory Group (INSAG), Summary Report on the Post-Accident Review Meeting on the Chernobyl Accident, Safety Series No. 75-INSAG-1 (Vienna: International Atomic Energy Agency, 1986), p. 76.

41 INSAG, Summary Report, p. 9 (emphasis added).

42 INSAG, Summary Report, p. 76.

43 Terry Kuykendall and Igor Khripunov, with Jason Lowe, “Harmonizing Nuclear Safety Culture and Security Culture: Final Report on the Proceedings of the International Workshop in Serpong, Indonesia, 29-31 January 2018,” Center for International Trade and Security, June 2018, p. 4, <https://www.researchgate.net/profile/Igor-Khripunov/publication/325908800_Harmonizing_Safety_and_Security_Culture_at_Nuclear_Facilities_Report_on_the_International_Workshop/links/5b2c12174585150d23c1a948/Harmonizing-Safety-and-Security-Culture-at-Nuclear-Facilities-Report-on-the-International-Workshop.pdf>.

44 William H. Sewell, Jr., “The Concept(s) of Culture,” in Gabrielle M. Spiegel, ed., Practicing History: New Directions in Historical Writing after the Linguistic Turn (London: Routledge, 2004), p. 77.

45 Sewell, “The Concept(s) of Culture,” p. 77.

46 Susan Wright, “The politicization of culture,” Anthropology Today, Vol. 14, No. 1 (February 1998), p. 11, <http://ruraleconomics.fib.ugm.ac.id/wp-content/uploads/Susan-Wright-The-Politicisation-of-Culture.pdf>.

47 IAEA, “Regulatory Oversight of Safety Culture in Nuclear Installations,” IAEA-TECDOC-1707 (Vienna: International Atomic Energy Agency, 2013), p. 3, <https://www-pub.iaea.org/MTCD/Publications/PDF/TE_1707_CD/PDF/TECDOC_1707.pdf>.

48 IAEA, “Nuclear Verification and Security of Material. Physical Protection Objectives and Fundamental Principles,” GOV/2001/41, August 15, 2001. The document is an attachment to IAEA Board of Governors, “Measures to Improve the Security of Nuclear Materials and Other Radioactive Materials,” GC(45)/INF/14, September 14, 2001, <https://www.iaea.org/sites/default/files/gc/gc45inf-14_en.pdf>.

49 IAEA, “Nuclear Verification,” Attachment C, p. 1.

50 IAEA, “Nuclear Verification,” Attachment C, p. 3.

51 Nuclear-security culture was also the focus of considerable state-level effort during this period, particularly in the United States, but development of the concept in the context of the IAEA arguably gives a better insight into its broader, international evolution over time.

52 IAEA, “Code of Conduct on the Safety and Security of Radioactive Sources,” IAEA/CODEOC/2004, 2004, p. 117, <https://www-pub.iaea.org/MTCD/publications/PDF/Code-2004_web.pdf>.

53 IAEA, “Nuclear Security: Global Directions for the Future,” proceedings of an international conference, London, 16–18 March 2005, <https://www-pub.iaea.org/MTCD/Publications/PDF/Pub1232_web.pdf>.

54 IAEA, “Nuclear Security,” p. 99.

55 The amendment to the CPPNM entered into force in 2016. See IAEA, “Amendment to the Convention on the Physical Protection of Nuclear Material,” INFCIRC/274/Rev.1/Mod.1, May 9, 2016, <https://www.iaea.org/sites/default/files/infcirc274r1m1.pdf>.

56 IAEA, Nuclear Security Culture: Implementing Guide, IAEA Nuclear Security Series No. 7 (Vienna: International Atomic Energy Agency, 2008).

57 IAEA, Nuclear Security Culture, p. 5.

58 Igor Khripunov, “A culture of security: Focus for the next Nuclear Security Summit?,” Bulletin of the Atomic Scientists, June 26, 2015, <https://thebulletin.org/2015/06/a-culture-of-security-focus-for-the-next-nuclear-security-summit/>.

59 Khripunov, “A culture of security.”

60 IAEA, Self-assessment of Nuclear Security Culture in Facilities and Activities, IAEA Nuclear Security Series No. 28-T (Vienna: International Atomic Energy Agency, 2017).

61 Khripunov, “A culture of security.”

62 WINS, “Nuclear Security Culture—A World Institute for Nuclear Security Best Practice Guide,” February 2019 (available to WINS members); (UK) Nuclear Industry Safety Directors' Forum, “Key Attributes of an Excellent Nuclear Security Culture,” June 2013, <https://www.nuclearinst.com/write/MediaUploads/SDF%20documents/Security/Key_attributes_of_an_excellent_Nuclear_Security_Culture.pdf>.

63 IAEA, Self-assessment of Nuclear Security Culture, p. 36.

64 William G. Tierney, “Organizational Culture and Leadership,” Academy of Management Review, Vol. 11, No. 3 (1986), p. 677, <https://www.jstor.org/stable/pdf/258322.pdf>.

65 Edgar H. Schein, Organizational Culture and Leadership, 3rd edn. (San Francisco: Jossey-Bass, 2004), p. 16.

66 Edgar H. Schein, “National and Occupational Culture Factors in Safety Culture,” revised draft for IAEA meeting, April 9, 2014, p. 3, <https://gnssn.iaea.org/NSNI/SC/WS_GSC/National%20and%20Occupational%20Culture%20Factors%20in%20Safety%20Culture_Prof%20Edgar%20H%20Schein.pdf>.

67 Schein, Organizational Culture and Leadership, p. 24.

68 IAEA, Nuclear Security Culture: Implementing Guide, p. 19.

69 Tierney, “Organizational Culture and Leadership,” p. 677.

70 IAEA, Self-assessment of Nuclear Security Culture, pp. 41–60.

71 IAEA, Nuclear Security Culture: Implementing Guide, p. 19.

72 The authors conducted 43 interviews with a wide range of staff within nuclear organizations managed by the NDA, including nuclear power plants, facilities with Category I nuclear material (the types of nuclear material requiring the highest level of physical protection), and low-level-waste repositories. The interviews were carried out as part of a broader exercise to assess security culture within the NDA.

73 Examples in the area of computer security include H. Liang and Y. Xue, “Understanding Security Behaviors in Personal Computer Usage: A Threat Avoidance Perspective,” Journal of the Association for Information Systems, Vol. 11, No. 7 (July 2010), pp. 394–413, <https://www.researchgate.net/profile/Huigang-Liang/publication/220580586_Understanding_Security_Behaviors_in_Personal_Computer_Usage_A_Threat_Avoidance_Perspective/links/55b6fef008aec0e5f4380213/Understanding-Security-Behaviors-in-Personal-Computer-Usage-A-Threat-Avoidance-Perspective.pdf>; and L. Zhang and W. McDowell, “Am I Really at Risk? Determinants of Online Users' Intentions to Use Strong Passwords,” Journal of Internet Commerce, Vol. 8, No. 3 (July 2009), pp. 180–197, doi: 10.1080/15332860903467508.

74 IAEA, Nuclear Security Culture: Implementing Guide, p. 19.

75 IAEA, Development, Use and Maintenance of the Design Basis Threat: Implementing Guide, IAEA Nuclear Security Series No. 10 (Vienna: International Atomic Energy Agency, 2009).

76 Matthew Bunn and Kathryn M. Glynn, “Preventing Insider Theft: Lessons from the Casino and Pharmaceutical Industries,” Journal of Nuclear Materials Management, Vol. 41, No. 3 (Spring 2013), pp. 4–16, <https://dash.harvard.edu/bitstream/handle/1/10861136/Preventing%20Insider%20Theft-V%2041_3.pdf>.

77 Björn Lindström and Armita Golkar, with Simon Jangard, Philippe N. Tobler, and Andreas Olsson, “Social threat learning transfers to decision making in humans,” Proceedings of the National Academy of Sciences of the United States of America, Vol. 116, No. 10 (March 5, 2019), pp. 4732–4737, <https://www.pnas.org/content/pnas/116/10/4732.full.pdf>; Joshua Wood, “Framing terror: an experimental framing effects study of the perceived threat of terrorism,” Critical Studies on Terrorism, Vol. 4, No. 2 (August 2011), pp. 119–217, <https://www.tandfonline.com/doi/abs/10.1080/17539153.2011.586205>.

78 Company director, nuclear operator (name and organization withheld by request), UK NDA, in-person interview with the authors, Cumbria, November 1, 2017.

79 See, for example, a US NRC study that found that the majority of insider thefts occurred more than five years after the start of employment. Mullen, “Generic Adversary Characteristics,” p. 103.

80 Matthew Bunn and Eben Harrell, “Threat Perceptions and Drivers of Change in Nuclear Security Around the World: Results of a Survey,” Belfer Center for Science and International Affairs, March 2014, pp. 4–6, <https://www.belfercenter.org/publication/threat-perceptions-and-drivers-change-nuclear-security-around-world-results-survey>.

81 Mullen, “Generic Adversary Characteristics,” p. 102.

82 Bunn and Glynn, “Preventing Insider Theft,” p. 16.

83 WINS, “ICONS: Nuclear Security Takes Centre Stage on February 12,” World Institute for Nuclear Security, December 12, 2019, <https://wins.org/nuclear-security-takes-centre-stage/>; Jongsook Kim and Hyung-kyung Lee, with Jin-young Lee, “Challenges and Responses for Ensuring Sustainability of INSA Training Programs,” International Journal of Nuclear Security, Vol. 2, No. 1 (2016), p. 4, <https://trace.tennessee.edu/cgi/viewcontent.cgi?article=1039&context=ijns>.

84 Maurizio Cavallari, “A Conceptual Analysis about the Organizational Impact of Compliance on Information Security Policy,” Exploring Services Science: Third International Conference, IESS 2012, Geneva, Switzerland, February 15–17, 2012, pp. 101–102, doi: 10.1007/978-3-642-28227-0_8.

85 Iacovos Kirlappos and Simon Parkin, with M. Angela Sasse, “Learning from ‘Shadow Security’: Why understanding non-compliant behaviors provides the basis for effective security,” paper presented at the Workshop on Usable Security, San Diego, CA, USA, February 23, 2014, <https://discovery.ucl.ac.uk/id/eprint/1424472/1/Kirlappos%20et%20al.%20-%202014%20-%20Learning%20from%20%E2%80%9CShadow%20Security%E2%80%9D%20Why%20understanding.pdf>. (For the full workshop program, see <https://www.ndss-symposium.org/ndss2014/workshop-usable-security-usec-2014-programme/>.)

86 Business officer, nuclear operator (name and organization withheld by request), UK NDA, in-person interview with the authors, Cumbria, January 16, 2018.

87 Finance officer, nuclear operator (name and organization withheld by request), UK NDA, in-person interview with the authors, Cumbria, November 1, 2017.

88 Personnel manager, nuclear operator (name and organization withheld by request), UK NDA, in-person interview with the authors, Cumbria, September 19, 2017.

89 This was not the case in all organizations, with other interviewees noting that members of the security team were increasingly approachable and willing to have detailed discussions about security procedures.

90 Wyn Q. Bowen and Christopher Hobbs, “Sensitive Nuclear Information: Challenges and Options for Control,” Strategic Analysis, Vol. 38, No. 2 (March 2014), p. 225, <https://www.tandfonline.com/doi/abs/10.1080/09700161.2014.884441>.

91 Bowen and Hobbs, “Sensitive Nuclear Information,” p. 225.

92 IAEA, Nuclear Security Culture: Implementing Guide, pp. 32–33.

93 Security manager, nuclear operator (name and organization withheld by request), UK NDA, in-person interview with the authors, Cumbria, December 9, 2019.

94 US Government Accountability Office, “Department of Energy: Whistleblower Protections Need Strengthening,” GAO-16-618, July 11, 2016, pp. 1–66, <https://www.gao.gov/products/gao-16-618>.

96 Mullen, “Generic Adversary Characteristics,” p. 103; R. N. Reinstedt and Judith Westbury, “Major Crimes as Analogs to Potential Threats to Nuclear Facilities and Programs,” Rand, N-1498-SL, April 1980, <https://www.rand.org/content/dam/rand/pubs/notes/2009/N1498.pdf>.

97 DoE, Office of Inspector General, “Special Report,” p. 7.

98 IAEA, Self-assessment of Nuclear Security Culture, pp. 2, 11.

99 IAEA, Self-assessment of Nuclear Security Culture, pp. 9, 11.

100 Vladimir Yankov and Seema Gahlaut, “Stakeholder: Hands on Experience in Culture and Future Strategies,” in Julia Thompson and Seema Gahlaut, eds., CBRN Security Culture in Practice, NATO Science for Peace and Security Series (IOS Press, 2015), pp. 59–61; Carsten Speicher and Igor Khripunov, with Nataliia Klos and Jan Soderman, “Assessment of Security Culture at Nuclear Power Plants,” in Thompson and Gahlaut, CBRN Security Culture in Practice, pp. 139–141.

101 I. Khripunov and P. Khairul, with D. Ebel and D. Nikonov, “Assessing Nuclear Security Culture: The Experience of Indonesia,” paper presented at the Institute of Nuclear Materials Management annual meeting, Atlanta, United States, July 20–24, 2013.

102 Centre for the Protection of National Infrastructure, “Introduction to SeCuRE 4,” 2018, p. 1, <https://www.cpni.gov.uk/system/files/documents/5b/c9/Introduction_to_SeCuRE_4.pdf>.

103 Security officer, nuclear operator (name and organization withheld by request), UK NDA, in-person interview with authors, Cumbria, January 16, 2018.

104 Security officer, nuclear operator (name and organization withheld by request), UK NDA, in-person interview with authors, Cumbria, August 8, 2017.

105 Khairul Khairul, “Nuclear Security Culture Self Assessments at Nuclear Research Reactors: BATAN’s Experiences,” paper presented at the Insider Threat Mitigation Symposium, Brussels, Belgium, March 12–14, 2019, <http://insiderthreatmitigation.org/assets/docs/presentations/12Mar-KKhairul-StratChall4.pdf>; Anhar Riza Antariksawan and Khairul Khairul, with Heru Umbara, Endang Kristuti, and Bayu Purnomo, "Conducting Nuclear Security Culture Self-Assessments in Nuclear Research Facilities Using the IAEA Methodology," International Journal of Nuclear Security, Vol. 4, No. 1 (June 2018), p. 12, <https://trace.tennessee.edu/cgi/viewcontent.cgi?article=1080&context=ijns>.

106 Khripunov et al., “Assessing Nuclear Security Culture,” p. 3.

107 Khripunov et al., “Assessing Nuclear Security Culture,” p. 3.

108 Antariksawan et al., "Conducting Nuclear Security Culture Self-Assessments," p. 3.

109 Paul E. Ebel and K. Khairul, with Igor Khripunov, Noor Syakeera Mukhelas, and Pirunthavany Muthuvela, “Self-Assessment of Security Culture for Users of Radioactive Sources,” presented at the Institute of Nuclear Materials Management Annual Meeting, Indian Wells, California, USA, July 12–16, 2015.

110 Security officer, nuclear operator (name and organization withheld by request), UK NDA, in-person interview with authors, Cumbria, January 16, 2018.

111 Contractor, nuclear operator (name and organization withheld by request), UK NDA, in-person interview with authors, Cumbria, January 16, 2018.