Research Article

Examining Online Behaviors of Violent and Non-Violent Right-Wing Extremists During Peak Posting Days

Received 24 Jan 2024, Accepted 19 Apr 2024, Published online: 05 May 2024

Abstract

Despite the ongoing need for practitioners to identify violent extremists online before their engagement in violence offline, little is empirically known about their digital footprints generally or differences in their posting behaviors compared to their non-violent counterparts particularly – especially on high-frequency posting days. Content analysis was used to examine postings from a unique sample of violent and non-violent right-wing extremists as well as from a sample of postings within a sub-forum of the largest white supremacy forum during peak and non-peak posting days for comparison purposes. Several noteworthy posting behaviors were identified that may assist in identifying credible threats online.

This exploratory study is derived from a larger research project on the online behavioral posting patterns of violent and non-violent right-wing extremists (RWEs)Footnote1 within a sub-forum of one of the largest and best-known white supremacy web forums, Stormfront. This study expands prior research by examining the online behaviors of violent and non-violent RWEs during peak posting days (i.e. days comprising the highest frequency of postings) and non-peak posting days (i.e. days that do not comprise the highest frequency of postings). A sample of postings from the RWE sub-forum was also analyzed for comparison purposes, as little is empirically known about whether the online behaviors of RWE users who post online in general differ from those of violent and non-violent RWEs, especially during high-frequency posting days. This study represents an original contribution to the academic literature on online terrorism and extremism on three fronts.

First, a primary concern for many researchers, practitioners, and policymakers is the role of the Internet in facilitating violent extremism and terrorism.Footnote2 Here questions are often focused on the impact of the offenders’ consumption of and networking around violent extremist online content in their acceptance of extremist ideology and/or their decision to engage in violent extremism and terrorism.Footnote3 Although growing attention has been given to identifying violent extremists online before their engagement in violence offline and analyzing their online presence,Footnote4 few empirically grounded analyses have addressed this key area of research. What the limited evidence does suggest is that practitioners and policymakers should not conceive violent extremism as an offline versus online dichotomy. To illustrate, Gill and CornerFootnote5 examined the behavioral underpinnings of lone-actor terrorists in the United States (U.S.) and Europe and found that the growth of the Internet altered their means of radicalization and attack learning. Gill and colleaguesFootnote6 examined the online behaviors of United Kingdom (U.K.)-based terrorists and similarly found that those who learned online were more likely than those who did not learn online to network and interact offline. Holbrook and TaylorFootnote7 assessed pre-arrest media usage of U.K.-based terrorists and found that they consumed a variety of media across a range of platforms as well as interacted both on- and offline. Lastly, Gaudette and colleaguesFootnote8 interviewed former violent RWEs and identified an important interaction between their on- and offline behaviors while involved in violent extremism that were entangled with their extremist activities, identities, and need for security. Despite these foundational studies, additional research is needed to connect the on- and offline worlds of violent extremists.Footnote9

Second, researchers, practitioners, and policymakers have paid close attention to the presence of violent extremists and terrorists online in recent years, with a particular emphasis on the role of the Internet in facilitating violent extremism and terrorism in generalFootnote10 and the digital behaviors of the extreme right in particular.Footnote11 This is largely the result of ongoing reports that some violent RWEs and terrorists were active online before their attacks.Footnote12 One of the most notable examples is that of 28-year-old Australian Brenton Tarrant who, before killing 51 people in two Christchurch, New Zealand mosques in 2019 and live-streaming his attack, announced his intentions on 8chan and produced a “manifesto” linked on the website.Footnote13 Similarly, 18-year-old Payton Gendron posted details of his plan to attack the Tops Friendly Markets store in Buffalo, NY in a private Discord chat room prior to killing 10 people as well as live-streamed parts of his attack on Twitch and allegedly wrote and posted a 180-page manifesto on 4chan.Footnote14

Understandably, researchers have focused on the activities of RWEs on various platforms including websites and discussion forums,Footnote15 mainstream social media sites such as Facebook,Footnote16 X (formerly Twitter),Footnote17 and YouTube,Footnote18 fringe platforms including 4chanFootnote19 and Gab,Footnote20 digital applications such as TikTokFootnote21 and Telegram,Footnote22 as well as in mainstream videogames.Footnote23 But most studies, similar to research on the causes of violent extremism and terrorism in general, lack comparison groups despite an urgent need to focus on comparative analyses and consider how violent extremists are different from non-violent extremists.Footnote24 In other words, empirical research has largely overlooked differences in the online behaviors of those who share extreme ideological beliefs but are violent or non-violent offline – few empirical studies have investigated online behaviors of violent and non-violent extremists. Footnote25 Holt and colleaguesFootnote26 examined the underlying theoretical assumptions evident in radicalization models through a case-study analysis of violent and non-violent extremists. Wolfowicz and colleaguesFootnote27 used a matched case-control design to differentiate between terrorists and non-violent radicals based on their social media profiles. Lastly, Scrivens and colleaguesFootnote28 developed a dataset on the posting behaviors of violent and non-violent RWEs. Derived from this larger research project, the authorsFootnote29 examined the posting behaviors of violent and non-violent RWEs and whether specific behaviors were characteristic of users’ violence status. Davies and colleaguesFootnote30 examined how violent and non-violent RWE identities take shape over time online. Scrivens and colleaguesFootnote31 explored how violent and non-violent RWEs’ time of entry into the lifespan of an extremist online community and posting activity predicted their violent status. Scrivens and colleaguesFootnote32 developed and compared posting typologies among a sample of violent and non-violent RWEs. Finally, ScrivensFootnote33 quantified the existence of extremist ideologies, personal grievances, and violent extremist mobilization efforts expressed by a sample of violent and non-violent RWEs as well as a sample of postings within an extremist online community. Having said that, there is still value in closely examining the online behaviors of violent extremists to inform future risk factor frameworks used by law enforcement and intelligence agencies to identify credible threats online.Footnote34 The current study expands on this researchFootnote35 by examining the online behaviors of violent and non-violent posters during peak posting days and non-peak posting days for comparison purposes, as prior work has not considered the online behaviors of such posters during high-frequency posting days.

Third, it has become increasingly common in online terrorism and extremism research to examine the development or spread of extremist content onlineFootnote36 or the propensity towards violent radicalization online.Footnote37 Recent scholarship in this regard has also considered the impact of trigger or galvanizing events on extremists' posting patterns such as the effect of riots,Footnote38 rallies,Footnote39 terrorist attacks,Footnote40 insurrections,Footnote41 and the COVID-19 pandemic.Footnote42 The role of political elections as trigger events has also become a growing area in terrorism and extremism work – especially in the empirical research on RWEs' use of the InternetFootnote43 – with the primary focus on the relationship between tweets about the 2016 U.S. presidential election and hate speech on XFootnote44, the growth of alt-right networks on XFootnote45 and 4chanFootnote46 in response to Donald Trump's election victory, and the impact of various presidential election results on Stormfront posting behaviors.Footnote47 But despite these contributions, very little is known in online terrorism and extremism research about posting behaviors during days that generate the most user engagement in general. It is common in other disciplines and fields, such as computing science, business, and information management, to examine online behaviors during peak or high-traffic posting periods (mostly peak posting hours) to generate knowledge on how various digital platforms are engaged with by users as well as topics of discussion driving the discourse,Footnote48 but online terrorism and extremism research has lagged in this regard.Footnote49

This is an important oversight for several reasons. First, examining the online behaviors of RWEs during peak days would facilitate better understanding of the composition of and patterns comprising these high-traffic and popular discussion days – especially if there are differences in the content posted by violent and non-violent users in this regard. Indeed, peak posting days generate the most user engagementFootnote50 as users must take time to write a post to participate in an online community and, in so doing, attach themselves to a particular viewpoint by sharing it online. Further, more eyes are likely viewing the content posted on these high-frequency days, giving it a wider audience and reach.Footnote51 Such a high volume of engagement may also increase the potential to influence other viewers, as has been found in key research on social influence,Footnote52 as well as be persuasive in radicalizing individuals to extremist violence.Footnote53 Some users may take advantage of these high-traffic days to spread their extremist views. Second, much of the current thinking about the relationship between online posting activity and extremist violence seems to be premised on the untested assumption that more posting equals more violence. While this presumption is intuitively appealing, it is not grounded empirically.Footnote54 Relatedly, practitioners and policymakers continue to struggle with the size and scope of the potential online violent extremist threat.Footnote55 In attempting to further assess the distinction in posting behavior between violent and non-violent users, this research aims to assist practitioners and policymakers in their critical efforts to identify credible threats online.

Current Study

Data and Sample

This study analyzed all open-access content expressed by a sample of violent and non-violent RWEs in a Canadian-themed sub-forum of Stormfront, which is the oldest and largest racial hate site and one of the most influential RWE forums in the world.Footnote56 Stormfront is made up of an array of sub-sections addressing a variety of topics, including an "International" section composed of a range of geographically and linguistically bounded sub-forums (e.g. Stormfront Europe, Stormfront Downunder, and Stormfront Canada). Stormfront has also served as a "funnel site" wherein forum users have been recruited by other RWE users into violent offline groups (e.g. Blood & Honour, the Hammerskins, and various Ku Klux Klan branches).Footnote57 Stormfront currently has approximately 379,000 "members" and contains over 14.4 million posts.

Several emerging digital spaces have been adopted by RWEs in recent years,Footnote58 but Stormfront has been the focus of much research attention since its inception and, despite RWEs transitioning to other platforms, it continues to be a valuable online space for researchers to assess behavioral posting patterns.Footnote59 To illustrate, researchers have assessed recruitment efforts made by forum users,Footnote60 the formation of a virtual communityFootnote61 and collective identity,Footnote62 the extent to which the site is connected to other racial hate sites,Footnote63 and how discourse on the site is less virulent and more palatable to readers.Footnote64 Most recently, researchers have examined RWE posting behaviors found on the platform,Footnote65 the development of user activity and extremist language there,Footnote66 the impact of presidential election results on Stormfront posting behaviors,Footnote67 and the ways in which the collective identity of the extreme right takes shape over timeFootnote68 and is affected by offline intergroup conflict on the forum.Footnote69 Some exploratory work has also compared the developmental posting behaviors of violent and non-violent users on Stormfront.Footnote70 However, few studies have identified differences in posting behaviors of Stormfront users who share extreme ideological beliefs but are violent or non-violent in the offline worldFootnote71 and even fewer studies have examined online behaviors on posting days that generate the most user engagement. The current study expands on prior research that differentiates the posting behaviors of violent and non-violent RWEs.Footnote72

Data collection and sampling efforts proceeded in four stages. First, a custom-written computer program that was designed to collect vast amounts of information online captured all open-source content on Stormfront Canada,Footnote73 which resulted in approximately 125,000 sub-forum posts made by approximately 7,000 authors between September 12, 2001 and October 12, 2016.Footnote74 Second, to pinpoint users in the sub-forum who were violent or non-violent RWEs offline, a former violent extremistFootnote75 voluntarily reviewed a list of the approximately 7,000 users who posted in the sub-forum and selected those who matched one of the two user types.Footnote76 As a result, a total of 49 violent and 50 non-violent RWEs were identified from the list of usernames and their content was then isolated from the collected sub-forum data: 12,617 posts from the violent users and 17,659 posts from the non-violent users.Footnote77 Overall, the sample included 30,276 posts, with the first post made on September 1, 2004 and the last post made on October 12, 2016.

Importantly, Stormfront Canada was selected because it was an online space in which the former extremist actively participated during their involvement in violent right-wing extremism and with which they were intimately familiar; that is, they knew the users who posted there and could identify individuals who they knew were violent or non-violent RWEs in the offline world. The former extremist was actively involved in the North American RWE movement and was a prominent figure there for more than 10 years, both in recruitment and leadership roles – and primarily in violent racist skinhead groups in Canada.

In an effort to verify the authenticity of each user identified by the former extremist, the user identification process was done under the supervision of the lead researcher of this project. Specifically, each time the former extremist identified a user of interest, they were asked to explain in as much detail as possible why the user was identified as a violent or non-violent RWE. The former extremist was also asked to provide detailed examples of the activities that each user engaged in as well as their association with or connection to each identified user. It is worth noting that the informant, who was familiar with each of the 7,000 users, was not asked to link all of them to their usernames; however, for those who were identified for the current study, to the best of my knowledge each username represented a unique user. In particular, the informant reviewed a list of all sub-forum usernames and, next to each identified user, recorded the name of the individual behind it, thus documenting that each username was distinct and that there was no overlap among them – a sampling procedure supported by previous studies.Footnote78 Indeed, this single source may be more familiar with the histories of some of the identified individuals than others. The informant categorized individuals as violent if s/he had personally witnessed the violent activities and/or had direct first-hand knowledge of said activities.

RWEs who were identified for the current study were actively involved in right-wing extremism. In particular, they – like all extremists – structure their beliefs on the basis that the success and survival of the in-group is inseparable from the negative acts of an out-group and, in turn, they are willing to assume both an offensive and defensive stance in the name of the success and survival of the in-group.Footnote79 RWEs in the current study were therefore characterized as those who subscribed to a racially, ethnically, and/or sexually defined nationalism, which is typically framed in terms of white power and/or white identity (i.e. the in-group) that is grounded in xenophobic and exclusionary understandings of the perceived threats posed by some combination of non-whites, Jews, Muslims, immigrants, refugees, members of the LGBTQ+ community, and feminists (i.e. the out-group(s)).Footnote80 In addition, violent RWEs in the current study committed several acts of known physical violence against a person, including violent attacks against minorities and anti-racist groups. Violence in this regard aligns with Bjørgo and Ravndal'sFootnote81 conceptualization of RWE violence, which they define as "violent attacks whose target selection is based on extreme-right beliefs and corresponding enemy categories—immigrants, minorities, political opponents, or governments […] or spontaneous violence." On the other hand, non-violent RWEs in the current study did not engage in physical violence against a person in any known capacity but were actively involved in RWE activities offline, including but not limited to rallies, marches, protests, postering and flyering campaigns, and group meetings and gatherings.

Third, "peak posting days" and "non-peak posting days" were identified in the data if they met specific criteria. Peak posting days were operationalized as days whose posting frequency falls in the 99th percentile of the observed data. In other words, all days that included a posting frequency in the top one percent of the observed data were classified as peak posting days (n = 56). Figure 1 describes the distribution of postings in the data and highlights the uniqueness of the peak posting days.

Figure 1. Distribution of postings on Stormfront Canada.

By comparison, non-peak posting days consisted of a sample of 56 days randomly selected from the remaining pool of posting days in the data (i.e. all posting days with posting frequencies that did not fall in the top one percent of the observed data). A comparison of the mean number of posts/day during peak days and non-peak days also shows the uniqueness of the peak days (97.107 posts and 25.429 posts, respectively).Footnote82
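To make this operationalization concrete, the following is a minimal sketch in Python (assuming pandas and a hypothetical DataFrame of sub-forum posts with a "timestamp" column; column and variable names are illustrative rather than taken from the study's actual pipeline) of how peak days could be flagged at the 99th percentile and an equally sized random sample of non-peak days drawn:

```python
import pandas as pd

def split_peak_days(posts: pd.DataFrame, seed: int = 42):
    # Count the number of posts made on each calendar day
    daily_counts = (
        posts.assign(day=pd.to_datetime(posts["timestamp"]).dt.date)
             .groupby("day")
             .size()
    )

    # Peak days: days whose posting frequency falls in the top 1% (99th percentile)
    threshold = daily_counts.quantile(0.99)
    peak_days = daily_counts[daily_counts >= threshold]

    # Non-peak days: an equally sized random sample drawn from all remaining days
    non_peak_pool = daily_counts[daily_counts < threshold]
    non_peak_days = non_peak_pool.sample(n=len(peak_days), random_state=seed)

    # Report mean posts/day for each set of days (cf. ~97.1 vs. ~25.4 in the study)
    print(f"{len(peak_days)} peak days, mean posts/day = {peak_days.mean():.3f}")
    print(f"{len(non_peak_days)} non-peak days, mean posts/day = {non_peak_days.mean():.3f}")
    return peak_days.index, non_peak_days.index
```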

Fourth, for comparison purposes a sample of postings was randomly selected from the content posted during the 56 peak days and 56 non-peak days by each of the following groups: the violent group, the non-violent group, and the broader group of sub-forum posters (herein referred to as "the comparison group"). Here, 1,000 posts were randomly selected from each of the three groups, with 500 posts/group from the peak days and the remaining 500 posts/group from the non-peak days (n = 3,000 posts).Footnote83 The comparison group was included because research has rarely identified whether the online posting behaviors of violent and non-violent extremists differ from those of individuals who post in the same online space generally.Footnote84 Posts that were randomly sampled from the comparison group did not include posts made by those in the violent and non-violent groups. As a result, each user was unique to each sample group, meaning that to the best of my knowledge there were no overlapping users across groups.
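Under the same assumptions as the sketch above (hypothetical "group" and "day" columns, with "day" derived from the timestamp as before), this fourth stage could be approximated as follows, drawing 500 posts per group for each day type to yield the 3,000-post analytic sample:

```python
def draw_analytic_sample(posts: pd.DataFrame, peak_days, non_peak_days, seed: int = 42):
    samples = []
    for group in ["violent", "non-violent", "comparison"]:
        for day_type, days in [("peak", peak_days), ("non-peak", non_peak_days)]:
            pool = posts[(posts["group"] == group) & (posts["day"].isin(days))]
            # 500 posts per group per day type -> 3 groups x 2 day types x 500 = 3,000 posts
            samples.append(pool.sample(n=500, random_state=seed).assign(day_type=day_type))
    return pd.concat(samples, ignore_index=True)
```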

Coding Schema

A reasonable starting place for this type of exploratory analysis was to draw from a set of codes used in previous research to examine online indicators of extremism.Footnote85 Here content analysis techniques were used to code the data for RWE ideologies and violent mobilization efforts.Footnote86 One coder analyzed a series of posts from the sample of 3,000 messages, with the unit of analysis being the content of each post, relative to the other indicator categories. Importantly, each observation could have received multiple codes. For example, an online post that contained anti-immigrant and anti-Semitic language was coded for both. To ensure reliability in the coding, an additional coder received systematic training on the inclusion criteria for each code and then analyzed 10% of the sample independently.Footnote87 This resulted in an inter-rater reliability of α = 0.93, which is much higher than the acceptable level of inter-rater agreement (i.e. α = 0.80).Footnote88
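As a rough illustration of this reliability check (not the implementation used in the study), the sketch below computes Krippendorff's alpha for nominal codes assigned by two coders to the same posts, assuming complete paired ratings with no missing values:

```python
from collections import Counter
from itertools import chain

def krippendorff_alpha_nominal(coder_a, coder_b):
    """Krippendorff's alpha for two coders, nominal (e.g. binary) data, no missing values."""
    assert len(coder_a) == len(coder_b)
    n = 2 * len(coder_a)  # total number of pairable values across both coders

    # Observed disagreement: share of coder pairings on which the two coders differ
    disagreements = sum(a != b for a, b in zip(coder_a, coder_b))
    d_o = 2 * disagreements / n

    # Expected disagreement from the pooled distribution of all assigned values
    counts = Counter(chain(coder_a, coder_b))
    d_e = sum(counts[c] * counts[k] for c in counts for k in counts if c != k) / (n * (n - 1))

    return 1.0 if d_e == 0 else 1 - d_o / d_e

# e.g. the 10% of the 3,000-post sample that was double-coded on one binary indicator:
# alpha = krippendorff_alpha_nominal(primary_coder_codes, secondary_coder_codes)
```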

Ideology posts were coded (0 = no, 1 = yes) based on the use of comments that detailed or referenced a known RWE ideology. The codebook for expressed ideologies was developed by drawing from the Southern Poverty Law Center’s (SPLC)Footnote89 identified RWE ideologies, which included anti-immigrant, anti-LGBTQ, anti-Muslim, anti-government, Christian Identity, anti-Semitic, male supremacy, and neo-folkish ideologies.Footnote90 In addition, anti-Black ideologies were added to the codebook, as research suggests that Black communities have been primary opponents of the RWE movementFootnote91 and anti-Black ideologies are widely discussed in RWE spaces online.Footnote92 Lastly, conspiratorial posts were added to the codebook, as research indicates that RWE ideologies are oftentimes steeped in conspiracy theories, which frequently involve a grave threat to national sovereignty and/or personal liberty, among other concerns.Footnote93 The codebook for expressed ideologies included 10 binary codes.

Violent extremist mobilization posts (0 = no, 1 = yes) were coded based on language suggesting that posters were preparing to engage in extremist violence and/or made efforts to mobilize others to extremist violence. This codebook was derived from the FBI’s “Homegrown Violent Extremist Mobilization Indicators.”Footnote94 The codebook for violent extremist mobilization indicators included 23 binary codes. For more on the codebook, see the Appendix.
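To illustrate the structure of the coding schema, the following sketch (code names are illustrative and cover only a subset of the full codebook) shows how a post can carry multiple binary codes and how the frequency of a given code could be tallied by sample group and day type, as reported in the Results:

```python
from dataclasses import dataclass, field

# Illustrative subset of the 10 ideology codes described above
IDEOLOGY_CODES = ["anti_government", "conspiratorial", "anti_immigrant", "anti_semitic"]

@dataclass
class CodedPost:
    group: str     # "violent", "non-violent", or "comparison"
    day_type: str  # "peak" or "non-peak"
    codes: set = field(default_factory=set)  # every code that applies; a post may receive several

def tally(posts, code):
    """Count posts carrying a given code, broken out by (group, day type)."""
    table = {}
    for post in posts:
        if code in post.codes:
            key = (post.group, post.day_type)
            table[key] = table.get(key, 0) + 1
    return table

# Example: a single post coded as both anti-immigrant and anti-Semitic
example = CodedPost(group="non-violent", day_type="peak",
                    codes={"anti_immigrant", "anti_semitic"})
```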

Results

The results are divided into two sections: a comparison of the frequency of ideological posts, followed by a comparison of violent extremist mobilization posts, in both cases within and across sample groups (i.e. violent, non-violent, and comparison groups) and by type of day (i.e. peak and non-peak posting days). Quotes from the data are presented where appropriate to demonstrate the tenor of postings. All online comments were quoted verbatim.

Expressed Ideologies

Table 1 provides the frequency of ideological posts expressed by each sample group during peak and non-peak posting days. In general, the non-violent group posted a larger proportion of ideological messages (403 posts) than the violent and comparison groups during peak posting days (297 posts and 283 posts, respectively). Similarly, during non-peak days the non-violent group posted a larger proportion of ideological messages (726 posts) than the violent and comparison groups (371 posts and 468 posts, respectively). Yet each sample group posted noticeably fewer ideological messages during peak posting days compared to non-peak posting days, as is shown in Table 1.

Table 1. Presentation of ideological posts by sample group.

Nonetheless, the type of ideological posts most frequently observed by each sample group was generally similar during peak and non-peak posting days. To illustrate, the most common ideological posts for the non-violent group during peak and non-peak days were anti-government (121 posts and 232 posts, respectively). The following are but a few examples of this ideological content:

Our Politicians sell us out, the experts are bought off, and they all do nothing but lie. Are any of these people held accountable of they are wrong….obviously not, so why listen to any of them. (Non-violent group post, peak posting day)

The government doesn’t want us to protect ourselves. They want a monopoly of weapons and force. Of course, they fail. (Non-violent group post, peak posting day)

The Government polls people all the time. They know what the public wants. Instead of using that information to serve the people, they just use it to help (brainwash) change public opinion to what the government deems is proper. (Non-violent group post, non-peak posting day)

Other frequently observed ideological posts by the non-violent group during peak and non-peak days were conspiratorial (95 posts and 208 posts, respectively), anti-immigrant (80 posts and 108 posts, respectively), and anti-Semitic (51 posts and 95 posts, respectively), which is reflected in the following messages:

We have nothing to say in our own (formerly white owned) countries anymore. Whatever can be taken is being taken from us. We don’t get half the news that we should know about, but our minds are being dulled with lala lessons on acceptance of everybody who is not like us and of every behavior that has traditionally been revolting to us. We are being herded toward the end of a cliff and we have nothing to hold onto. Will we all jump when they order us to jump? Jump for the love of jewry and non-whites, jump and be gone, ‘cause we stand in the way of alien radicals. (Non-violent group post, peak posting day)

The Jews favor a multicultural environment where they do not stand out as the lone alien group in a potentially hostile and, apart from the Jews, racially homogeneous population. They believe that if they can reduce the White population to minority status and undermine the economic and social foundation of our existence then they will be more secure. […] There is one further benefit to the Jews. A multiracial social environment will produce a mongrelized brown mass of socially atomized individuals without any clear racial or ethnic identity and provide an easy to control working class suitable for Jewish exploitation. (Non-violent group post, non-peak posting day)

By comparison, the most common types of ideological posts for the violent group were also similar during peak and non-peak days, but there were some minor differences observed across types of ideological posts. For example, the most frequent ideological post for the violent group during peak days was anti-Semitic (66 posts) and during non-peak days was anti-government (105 posts). Other frequently observed ideological posts for the violent group during peak days were conspiratorial (61 posts), anti-government (57 posts), and anti-immigrant (43 posts), while other frequent ideological posts during non-peak days were anti-Semitic (85 posts), conspiratorial (78 posts), and anti-immigrant (48 posts). Regardless of these differences, the most common ideological posts observed by violent users during peak and non-peak days were indeed anti-Semitic, conspiratorial, anti-government, and anti-immigrant. The following are two examples that summarize these discussions:

As I look around me I see so many non whites and hear so many foreign tounges that I want to explode. These people are not the same as you and I (of european heritage). They do not have have the same morals, principles or hygene for that matter. I must admit that I do feel quite defeated alot of the time, this style of government will always classify people such as you and I as the enemy. […] Doesn’t mean I have to like it or succumb to the ZOGs interpretation of what is expected of the their sheep in a multiracial land. (Violent group post, peak posting day)

The Jews who run this country MUST always get their way no matter what. They cannot simply sit there & see non-submisive Goy, openly defying them. The Jews have all the time & money, plus the Supreme Court is hand picked by them. In the end it is some what of a hollow victory for them. Compared to the U.S., the jail time, if any will be small. Our court system here is in some ways, just a make work program for lawyers, as most of the people in government are lawyers themselves. (Violent group post, non-peak posting day)

Lastly, the most frequent types of ideological posts for the comparison group were also similar during peak and non-peak posting days, but some differences were observed in the frequency of ideological posts that were made compared to the non-violent and violent groups. Specifically, the most common ideological posts for the comparison group were anti-government (69 posts), conspiratorial (59 posts), anti-immigrant (54 posts), and anti-Semitic (41 posts) during peak days, while the most common ideological posts during non-peak days were conspiratorial (120 posts), anti-government (113 posts), and anti-immigrant (102 posts). Such discussions tended to reflect the following sentiment:

We are being forced further and further away from our hometowns, being replaced by mass immigration and ridiculous 3rd world breeding. I will be moving 1.5 hours away from my job, just so I can have neighbors who are civilized and live the traditional white lifestyle. […] It sickens me how white people are told to tolerate our own ethnic cleansing, and if you dare speak up, you will lose your job and be labeled a racist supremacist… […] This is no longer the Country I loved (Comparison group post, non-peak posting day)

But unlike the most frequently observed ideological posts by the comparison group during peak days, anti-Black sentiment – which is expressed below – was among the most common ideological posts for this group during non-peak days (52 posts): "This place is a cesspool of multiculturalism. The city was built from the sweat and blood of Eastern Europeans. It has beutiful architecture on every street. Today the core is overrun with monkeys as well as their crack whore girlfriends" (Comparison group post, non-peak posting day).

Violent Extremist Mobilization Efforts

Comparing the frequency of violent extremist mobilization posts by sample group revealed several interesting patterns in the data. For example, all sample groups posted noticeably fewer mobilization posts than ideological posts during both peak and non-peak posting days. Nevertheless, and as is shown in Table 2, there were interesting variations in the types of mobilization posts observed across sample groups and posting days.

Table 2. Presentation of violent extremist mobilization posts across sample group.

That is, during peak posting days the violent group posted the largest proportion of mobilization posts (147 posts) compared to the non-violent and comparison groups (92 posts and 46 posts, respectively). On the other hand, during non-peak days the non-violent group posted a larger proportion of mobilization messages (140 posts) than the violent and comparison groups (79 posts and 105 posts, respectively). Furthermore, the violent group posted noticeably more mobilization messages on peak days than they did on non-peak days, while the non-violent and comparison groups posted noticeably fewer such messages on peak days than they did on non-peak days.

Yet similar to the ideological posts observed by each sample group, the types of mobilization posts that each group made were generally comparable during peak and non-peak posting days. In particular, the most common mobilization post expressed by the non-violent group consisted of attempts to radicalize others/pushing others to action, both during peak and non-peak days (52 posts and 96 posts, respectively). Postings in this regard reflected the following sentiment:

I have a fair number of confirmations for attendance and like I said those of you SFers [Stormfront users] who have been waiting to break out of your shells, just do it! Even if you are new we will welcome you so stop hiding [online] and join our struggle for the white race and get involved [offline]! (Non-violent group post, peak posting day)

I know there are more than 3 of us here. […] Perhaps we 3 are the only fearless ones. What’s the hesitation? There is no better time than now. It’s not going to get better and it’s not going to get safer. Words are nothing without action. […] I am a man of action. We can make history or we can talk alot and do nothing and fade away to nothing. :attack (Non-violent group post, non-peak posting day)

Additionally, among the most frequently observed mobilization posts by the non-violent group during peak days were messages seeking to recruit others to mobilize (20 posts) and advocating/encouraging violence (8 posts), and during non-peak days the most frequently observed mobilization posts were also advocating/encouraging violence (24 posts) and seeking to recruit others to mobilize (12 posts). Indeed, the online mobilization efforts of the non-violent group, regardless of the above minor differences, were focused on radicalizing others/pushing others to action, seeking to recruit others to mobilize, and advocating/encouraging violence, with more mobilization posts expressed during non-peak days than peak days.

Notably, the mobilization efforts expressed by the violent group differed from those of the non-violent and comparison groups. Not only did the violent group post a higher proportion of mobilization messages during peak days than non-peak days (unlike the other sample groups), but the most common types of mobilization posts from the violent group also differed from those of the non-violent and comparison groups. Similar to the other sample groups, the most common mobilization posts for the violent group during peak and non-peak days involved seeking to recruit others to mobilize (49 posts and 8 posts, respectively) and attempts to radicalize others/pushing others to action (41 posts and 34 posts, respectively). As an example of this discourse:

All I am trying to do is bring the people that wont come out, out. […] All I am talking about is becoming pro-active. […] There are many good people out here as I’m sure you know, We just need to get them all out. […] And dont get me wrong it’s not like there isn’t anything going on…there is a lot of pro active people. I just think there could be more. (Violent group post, peak posting day)

But unlike the most frequently observed mobilization posts expressed by the non-violent and comparison groups, one of the most common mobilization posts from the violent group consisted of icons/flags/prominent figures/symbols/slogans during both peak and non-peak posting days (25 posts and 24 posts, respectively), especially popular RWE slogans. The following are a few examples that best capture these mobilization efforts:

No Brother I would never call one of my own a jew! I know you are no jew. I was refering to the fact that there is a jew sympathizer amongst us and there is nothing we can do about it. Believe it …. Or Not. 88 [Heil Hitler]! (Violent group post, peak posting day)

We should not try and change history we should have an active knowledge of our predecessors and what they have created for us. Today is ours, and tomorrow is the future. RACE OVER ALL! (Violent group post, non-peak posting day)

I haven’t been logged on here in a while. . . . .ran into some trouble. sorry for not replying back […] if anyone of you guys want to get back to me……email me […] We must secure the existance of our race and the future for white children CHEERS. (Violent group post, non-peak posting day)

On the other hand, the mobilization efforts expressed by the comparison group mirrored those of the non-violent group, with the most frequent mobilization posts during peak and non-peak posting days consisting of attempts to radicalize others/pushing others to action (30 posts and 64 posts, respectively), seeking to recruit others to mobilize (6 posts and 18 posts, respectively), and advocating/encouraging violence (5 posts and 15 posts, respectively).

Discussion

This study examined the online posting behaviors of a unique sample of violent and non-violent RWEs within the open-access sections of a sub-forum of Stormfront during peak and non-peak posting days. Several conclusions can be drawn from this exploratory research.

First, a large proportion of ideological posts, especially in comparison to mobilization posts, were observed in the violent, non-violent, and comparison groups during both peak and non-peak posting days. This finding was expected, as previous research has found that StormfrontFootnote95 and other online platforms used by RWEs in generalFootnote96 contain a sizable amount of explicit and overt ideological content. But regardless of whether the content was posted on peak or non-peak days, the most frequently observed ideological discourse across the sample groups was generally anti-government, conspiratorial, anti-Semitic, and anti-immigrant. Empirical research has similarly found that anti-Semitic and conspiracy discourse are embedded in RWE ideologiesFootnote97 and in much of the RWE rhetoric expressed online, including in RWE forums,Footnote98 social media sites,Footnote99 and fringe platforms.Footnote100 Although some recent work has found that anti-government sentiment makes up a small proportion of postings found on several RWE forums,Footnote101 the findings from the current study align with other empirical work suggesting that RWEs oftentimes endorse anti-government sentiment and that it contributes to their underlying belief systemFootnote102 – and especially online.Footnote103 Similarly, the academic literature is mixed on the extent to which anti-immigrant sentiment is expressed in online RWE spaces compared to other extremist ideologies, with some work suggesting that it is less apparent in RWE forums than other RWE ideological contentFootnote104 while other work finds that anti-immigrant sentiment is prevalent in RWE content including on X,Footnote105 Facebook,Footnote106 Reddit,Footnote107 and RWE forums.Footnote108 The results of the current study align with the latter research.

Second, the non-violent group posted a noticeably larger proportion of ideological content on peak and non-peak posting days than the violent and comparison groups. This finding also comes as little surprise, as research similarly suggests that the online behaviors of non-violent RWEs tend to reflect those of an "ideologue" wherein they post a much larger proportion of ideological content than their violent counterpartsFootnote109 as well as those posting in the online community in general.Footnote110 These non-violent users likely perceive their role in and engage with the RWE movement as ideologues by providing "conceptual tools" that can be taken up by others involved in RWE violence, which has been reported in empirical research.Footnote111 Interestingly, though, anti-government postings were the most frequently observed ideological messages for this non-violent group during both peak and non-peak days, while the most frequently observed ideological messages for the violent and comparison groups varied by peak and non-peak days. For the violent group, anti-Semitic posts were the most common during peak days and anti-government posts were the most common during non-peak days, while anti-government messages were most common for the comparison group during peak days and conspiratorial messages were most common during non-peak days. Together, it appears that the non-violent group expressed themselves and engaged with the RWE movement as political types, as has been found in prior work,Footnote112 while the ideological content posted by the violent and comparison groups may in part be dictated by the topics of conversation in the online community more generally. Regardless, this evidence base remains in its infancy and requires further exploration.

Third, each sample group posted markedly fewer ideological messages during peak posting days than they did during non-peak days. This finding comes as a surprise because prior work on the online presence of RWEs has overwhelmingly found that peak and high-frequency posting days generate a sizeable amount of RWE content.Footnote113 Perhaps it is the case that users in the current study are of the view that posting an extensive amount of ideological content is unnecessary during peak posting days because user engagement is already high. It may also be the case that the topics of discussion during these peak days draw users away from engaging in extensive discussions about extremist ideologies. Nevertheless, these assumptions require further exploration.

Fourth, and perhaps most notable, was the extent to which violent extremist mobilization efforts were observed in all three sample groups in general and within and across peak and non-peak posting days in particular. The observed mobilization efforts, although less frequent than ideological posts, suggested that some users were preparing to engage in extremist violence or were making efforts to mobilize others to extremist violence, which has similarly been found in research on the online behaviors of violent and non-violent RWEsFootnote114 as well as in RWE forums known for facilitating violent extremism.Footnote115 Furthermore, across the three sample groups the most frequently observed mobilization posts were similar during peak and non-peak days, with the most common being attempts to radicalize others/pushing others to action as well as recruiting others to mobilize, which was also found in prior research on posting days generally.Footnote116 However, there were important differences in the most frequently observed mobilization efforts made across sample groups. During peak and non-peak posting days, for example, a common mobilization effort made by the non-violent and comparison groups was advocating/encouraging violence, while a common mobilization effort for the violent group was posting messages that contained icons/flags/prominent figures/symbols/slogans – in particular, popular RWE slogans. For the former, research has similarly found that non-violent RWEs post more online messages advocating/encouraging violence than their violent counterparts,Footnote117 perhaps in part because they are less fearful about doing so given that they are not engaging in physical violence themselves offline. Interestingly, for the latter, previous research has also found that violent RWEs post online using RWE slogans more often than their non-violent counterpartsFootnote118 and it is common for such markers to be embedded in the practices of violent RWE groups.Footnote119 This is a posting behavior that analysts who are searching for signs of violent extremists online should home in on. Along the same lines, most striking was the extent to which the violent group posted mobilization messages during peak days compared to non-peak days. In other words, unlike the other sample groups, the violent group posted a noticeably larger proportion of mobilization posts on peak days than they did on non-peak days. Perhaps the violent group – who were generally the least active of all sample groups – saw peak posting days as opportune times to take advantage of the high online traffic and make efforts to mobilize others to extremism. Such a finding was a surprise, as previous work has found that violent RWEs tend to post mobilization posts at a much lower rate than their non-violent counterparts.Footnote120 Yet other recent work suggests that the presumed positive association between posting frequency and risk of extremist violence may not be so straightforward.Footnote121 Indeed, this is a question of policy relevance that should be investigated in future research.

While this study offers a first step in assessing the online behaviors of violent and non-violent RWEs in one extremist community during peak and non-peak posting days, there are several limitations that may inform future research. Four points are worth discussing, in addition to the limitation that the validity of the data rests on a single key informant.Footnote122 First, analyses were somewhat limited by a small sample size, which restricted the ability to analyze the posting behaviors of an array of violent and non-violent RWEs found online. As a result, this study offers a glimpse into the posting behaviors of violent and non-violent RWEs during peak and non-peak posting days. Second, although an array of extremist ideologies and violent extremist mobilization efforts were captured across sample groups during peak and non-peak posting days, future research should assess the online content that violent and non-violent extremists are posting when they are not discussing ideologies or mobilization efforts, as doing so will provide additional insight into posting differentials or similarities across groups, especially during high-frequency posting days.

Further, and along the same lines, it is likely that certain sub-forum characteristics (e.g. the topics of conversation, users and groups who post in the space, and so on) account for some of the results of the current study. In short, researchers should examine posting behaviors during peak and non-peak posting days across the broader forum as well as in various platform types, such as a comparison of violent RWE forums with generic (non-violent) RWE forums, mainstream social media sites, fringe platforms, and digital applications, as well as during various time periods when extremist discourse may have changed. Such comparisons would provide practitioners and policymakers with much-needed insight into the extent to which extremist posting behaviors are unique to violent and non-violent posters as well as whether their online behaviors span online spaces that facilitate extremism more generally or whether certain platforms have unique functions for facilitating extremism and associated posting compositions. This could be done in combination with a mixed methods approach to identify key themes that emerge in the peak and non-peak posting days, or in combination with linguistic analysis tools.

Lastly, unlike the growing body of literature that has identified differences in the offline activities and behaviors of violent and non-violent extremists,Footnote123 data for the current study do not include information on key characteristics identified in this research such as an individual's employment status, criminal records, history of mental illness, and extremist/radicalized peers, among many others. As a result, there remain many unanswered questions about the characteristics of those identified as violent or non-violent RWEs in the current study. Research is therefore needed to assess whether the above characteristics and others mirror the sample of violent or non-violent extremists as well as whether certain characteristics drive posting behaviors. This could be done using a sampling procedure similar to that used for the current study, wherein a former extremist would identify violent and non-violent users, but sub-categories could be developed to capture the abovementioned key characteristics of each identified extremist as well as to develop a scheme to further identify differences in posting groups. This may include violent and non-violent criminal behaviors, such as threat of violence, physical damage to property, and so on. Former extremists who identify violent and non-violent users in data as part of a research project should also be asked questions regarding the logic of a user's posting behavior, such as whether violent and non-violent users are cognizant of their online presence and, if so, what posting patterns they display in particular situations (e.g. following an offline act of physical violence, a user may post few messages for a period of time for fear of drawing attention to themselves from law enforcement). Future research should also identify the exact moments in time when individuals engaged in offline violence and collect information on whether their attack was planned or unplanned, the victims or targets and the motive of the attacks, and then assess users' posting behaviors, among other things, both before and after the act of violence. Such an analysis may provide practitioners and policymakers with much-needed insight into whether specific online posting patterns escalate to physical violence as well as assist in developing methods to identify credible threats online prior to their engagement in violence offline.

Acknowledgement

Thank you to Tiana Gaudette for her assistance in coding data for the current study. Thanks also to Steven Chermak and Wally Wojciechowski for their valuable feedback on earlier versions of this study and to Richard Frank for his data collection efforts on the project. Lastly, thanks to Maura Conway, Garth Davies, Thomas Holt, Matteo Vergani, Stephane Baele, Isabelle van der Vegt, and Ayşe Lokman for their conceptual feedback on the study.

Disclosure Statement

No potential conflict of interest was reported by the author.

Notes

1 Garth Davies, Ryan Scrivens, Tiana Gaudette, and Richard Frank, "A Longitudinal Comparison of Violent and Non-Violent Right-Wing Extremist Identities Online," in Right-Wing Extremism in Canada and the United States, ed. Barbara Perry, Jeff Gruenewald, and Ryan Scrivens (Cham, Switzerland: Palgrave, 2022), 255–78; Ryan Scrivens, Garth Davies, Tiana Gaudette, and Richard Frank, "Comparing Online Posting Typologies among Violent and Nonviolent Right-Wing Extremists," Ahead of Print; Ryan Scrivens, Thomas W. Wojciechowski, Joshua D. Freilich, Steven M. Chermak, and Richard Frank, "Comparing the Online Posting Behaviors of Violent and Non-Violent Right-Wing Extremists," Terrorism and Political Violence 35, no. 1 (2023): 192–209; Ryan Scrivens, Thomas W. Wojciechowski, Joshua D. Freilich, Steven M. Chermak, and Richard Frank, "Differentiating Online Posting Behaviors of Violent and Non-Violent Right-Wing Extremists," Criminal Justice Policy Review 35, no. 9 (2022): 943–65; Ryan Scrivens, "Examining Online Indicators of Extremism among Violent and Non-Violent Right-Wing Extremists," Terrorism and Political Violence 35, no. 6 (2023): 1389–409; Thomas W. Wojciechowski, Ryan Scrivens, Joshua D. Freilich, Steven M. Chermak, and Richard Frank, "Testing a Probabilistic Model of Desistance from Online Posting Behavior in a Right-Wing Extremist Forum," International Journal of Comparative and Applied Criminal Justice, Ahead of Print.

2 See Maura Conway, “Determining the Role of the Internet in Violent Extremism and Terrorism: Six Suggestions for Progressing Research,” Studies in Conflict & Terrorism 40, no. 1 (2017): 77–98.

3 See Ryan Scrivens, Paul Gill, and Maura Conway, “The Role of the Internet in Facilitating Violent Extremism and Terrorism: Suggestions for Progressing Research,” in The Palgrave Handbook of International Cybercrime and Cyberdeviance, ed. Thomas J. Holt and Adam Bossler (London, UK: Palgrave, 2020), 1–22.

4 Joel Brynielsson, Andreas Horndahl, Fredik Johansson, Lisa Kaati, Christian Mårtenson, and Pontus Svenson, “Analysis of Weak Signals for Detecting Lone Wolf Terrorists,” Security Informatics 2, no. 11 (2013): 1–15; Katie Cohen, Fredik Johansson, Lisa Kaati, and Jonas C. Mork, “Detecting Linguistic Markers for Radical Violence in Social Media,” Terrorism and Political Violence 26, no. 1 (2014): 246–56; Lisa Kaati, Amendra Shrestha, and Katie Cohen, “Linguistic Analysis of Lone Offender Manifestos,” Proceedings of the 2016 IEEE International Conference on Cybercrime and Computer Forensics, Vancouver, BC, Canada.

5 Paul Gill and Emily Corner, “Lone-Actor Terrorist Use of the Internet and Behavioural Correlates,” in Lee Jarvis, Stuart Macdonald, and Thomas M. Chen, Eds., Terrorism Online: Politics, Law, Technology and Unconventional Violence (London, UK: Routledge, 2015), pp. 35-53.

6 Paul Gill, Emily Corner, Maura Conway, Amy Thornton, Mia Bloom, and John Horgan, “Terrorist Use of the Internet by the Numbers: Quantifying Behaviors, Patterns, and Processes,” Criminology and Public Policy 16, no. 1 (2017): 99–117.

7 Donald Holbrook and Max Taylor, “Terrorism as Process Narratives: A Study of Pre-Arrest Media Usage and the Emergence of Pathways to Engagement,” Terrorism and Political Violence 31, no. 6 (2019): 1307-1326.

8 Tiana Gaudette, Ryan Scrivens, and Vivek Venkatesh, “The Role of the Internet in Facilitating Violent Extremism: Insights from Former Right-Wing Extremists,” Terrorism and Political Violence 34, no. 7 (2022): 1339–56.

9 Ryan Scrivens, Tiana Gaudette, Maura Conway, and Thomas J. Holt, “Right-Wing Extremists’ Use of the Internet: Trends in the Empirical Literature,” in Right-Wing Extremism in Canada and the United States, ed. Barbara Perry, Jeff Gruenewald, and Ryan Scrivens (Cham, Switzerland: Palgrave, 2022), 355–80.

10 See Gaudette et al., “The Role of the Internet in Facilitating Violent Extremism”; see also Conway, “Determining the Role of the Internet in Violent Extremism and Terrorism”; Gill et al., “Terrorist Use of the Internet by the Numbers.”

11 Maura Conway, Ryan Scrivens and Logan Macnair, “Right-Wing Extremists’ Persistent Online Presence: History and Contemporary Trends,” The International Centre for Counter-Terrorism – The Hague 10(2019): 1–24; Thomas J. Holt, Joshua D. Freilich, and Steven M. Chermak, “Examining the Online Expression of Ideology Among Far-Right Extremist Forum Users,” Terrorism and Political Violence 34, no. 2 (2022): 364–84; Scrivens et al., “Right-Wing Extremists’ Use of the Internet.”

12 See, for example, Southern Poverty Law Center, “White Homicide Worldwide,” 1 April 2014. https://www.splcenter.org/20140331/white-homicide-worldwide (accessed 17 January 2024).

13 See Conway et al., “Right-Wing Extremists’ Persistent Online Presence.”

14 Ben Goggin and Kalhan Rosenblatt, “Buffalo Shooting Suspect Appeared to be Active in Online Gun Communities,” NBC News, 15 May 2022. https://www.nbcnews.com/tech/tech-news/buffalo-shooting-peyton-gendron-live-stream-gun-manifesto-suspect-rcna28911 (accessed 17 January 2024).

15 Les Back, "Aryans Reading Adorno: Cyber-Culture and Twenty-First Century Racism," Ethnic and Racial Studies 25, no. 4 (2002): 628–51; Ana-Maria Bliuc, John Betts, Matteo Vergani, Muhammad Iqbal, and Kevin Dunn, "Collective Identity Changes in Far-Right Online Communities: The Role of Offline Intergroup Conflict," New Media and Society 21, no. 8 (2019): 1770–86; Val Burris, Emery Smith, and Ann Strahm, "White Supremacist Networks on the Internet," Sociological Focus 33, no. 2 (2000): 215–35; Willem De Koster and Dick Houtman, "'Stormfront is Like a Second Home to Me,'" Information, Communication and Society 11, no. 8 (2008): 1155–76; Robert Futrell and Pete Simi, "Free Spaces, Collective Identity, and the Persistence of U.S. White Power Activism," Social Problems 51, no. 1 (2004): 16–42; Holt et al., "Examining the Online Expression of Ideology Among Far-Right Extremist Forum Users"; Ryan Scrivens, Garth Davies, and Richard Frank, "Measuring the Evolution of Radical Right-Wing Posting Behaviors Online," Deviant Behavior 41, no. 2 (2020): 216–32; Ryan Scrivens, "Exploring Radical Right-Wing Posting Behaviors Online," Deviant Behavior 42, no. 11 (2021): 1470–84; Magdalena Wojcieszak, "'Don't Talk to Me': Effects of Ideological Homogenous Online Groups and Politically Dissimilar Offline Ties on Extremism," New Media and Society 12, no. 4 (2010): 637–55.

16 Mattias Ekman, “Anti-Refugee Mobilization in Social Media: The Case of Soldiers of Odin,” Social Media + Society 4, no. 1 (2018): 1–11; Jade Hutchinson, Amarnath Amarasingam, Ryan Scrivens, and Brian Ballsun-Stanton, “Mobilizing Extremism Online: Comparing Australian and Canadian Right-Wing Extremist Groups on Facebook,” Behavioral Sciences of Terrorism and Political Aggression 15, no. 2 (2023): 215–45; Lella Nouri and Nuria Lorenzo-Dus, “Investigating Reclaim Australia and Britain First’s Use of Social Media: Developing a New Model of Imagined Political Communities Online,” Journal for Deradicalization 18 (2019): 1–37; Ryan Scrivens and Amarnath Amarasingam, “Haters Gonna “Like”: Exploring Canadian Far-Right Extremism on Facebook,” in Digital Extremisms: Readings in Violence, Radicalisation and Extremism in the Online Space, ed. Mark Littler and Benjamin Lee (London: Palgrave 2020), 63–89; Sebastian Stier, Lisa Posch, Arnim Bleier, and Markus Strohmaier, “When Populists Become Popular: Comparing Facebook Use by the Right-Wing Movement Pegida and German Political Parties,” Information, Communication and Society 20, no. 9 (2017): 1365–88.

17 J. M. Berger, Nazis vs. ISIS on Twitter: A Comparative Study of White Nationalist and ISIS Online Social Media Networks (Washington, DC: The George Washington University Program on Extremism, 2016); J. M. Berger and Bill Strathearn, Who Matters Online: Measuring Influence, Evaluating Content and Countering Violent Extremism in Online Social Networks (London, UK: The International Centre for the Study of Radicalisation and Political Violence, 2013); Pete Burnap and Matthew L. Williams, “Cyber Hate Speech on Twitter: An Application of Machine Classification and Statistical Modeling for Policy and Decision,” Policy and Internet 7, no. 2 (2015): 223–42; Roderick Graham, “Inter-Ideological Mingling: White Extremist Ideology Entering the Mainstream on Twitter,” Sociological Spectrum 36, no. 1 (2016): 24–36.

18 Mattias Ekman, “The Dark Side of Online Activism: Swedish Right-Wing Extremist Video Activism on YouTube,” MedieKultur: Journal of Media and Communication Research 30, no. 56 (2014): 21–34; Derek O’Callaghan, Derek Greene, Maura Conway, Joe Carthy, and Pádraig Cunningham, “Down the (White) Rabbit Hole: The Extreme Right and Online Recommender Systems,” Social Science Computer Review 33, no. 4 (2014): 1–20.

19 Joel Finkelstein, Savvas Zannettou, Barry Bradlyn, and Jeremy Blackburn, “A Quantitative Approach to Understanding Online Antisemitism,” arXiv:1809.01644, 2018; Antonis Papasavva, Savvas Zannettou, Emiliano De Cristofaro, Gianluca Stringhini, and Jeremy Blackburn, “Raiders of the Lost Kek: 3.5 Years of Augmented 4chan Posts from the Politically Incorrect Board,” arXiv:2001.07487, 2020.

20 Savvas Zannettou, Barry Bradlyn, Emiliano De Cristofaro, Haewoon Kwak, Michael Sirivianos, Gianluca Stringhini, and Jeremy Blackburn, “What is Gab: A Bastion of Free Speech or an Alt-Right Echo Chamber,” Proceedings of the WWW ’18 Companion Proceedings of The Web Conference 2018, Lyon, France; Yuchen Zhou, Mark Dredze, David A. Broniatowski, and William D. Adler, “Elites and Foreign Actors Among the Alt-Right: The Gab Social Media Platform,” First Monday 24, no. 9 (2019).

21 Gabriel Weimann and Natalie Masri, “Research Note: Spreading Hate on TikTok,” Studies in Conflict & Terrorism 46, no. 5(2023): 752–65.

22 Jakob Guhl and Jacob Davey, A Safe Space to Hate: White Supremacist Mobilisation on Telegram (London, UK: Institute for Strategic Dialogue, 2020); Aleksandra Urman and Stefan Katz, “What They Do in the Shadows: Examining the Far-Right Networks on Telegram”, Information, Communication & Society 25, no. 7 (2022): 904–23.

23 See Garrison Wells, Agnes Romhanyi, Jason G. Reitman, Reginald Gardner, Kurt Squire, and Constance Steinkuehler, “Right-Wing Extremism in Mainstream Games: A Review of the Literature,” Games and Culture. Ahead of Print.

24 Michael H. Becker, “When Extremists Become Violent: Examining the Association Between Social Control, Social Learning, and Engagement in Violent Extremism,” Studies in Conflict & Terrorism 44, no. 12 (2021): 1104–24; Steven Chermak, Joshua Freilich, and Michael Suttmoeller, “The Organizational Dynamics of Far-Right Hate Groups in the United States: Comparing Violent to Nonviolent Organizations,” Studies in Conflict & Terrorism 36, no. 3 (2013): 193–218; Joshua D. Freilich and Gary LaFree, “Criminology Theory and Terrorism: Introduction to the Special Issue,” Terrorism and Political Violence 27, no. 1 (2015): 1–15; Joshua D. Freilich, Steven M. Chermak, and Jeff Gruenewald, “The Future of Terrorism Research: A Review Essay,” International Journal of Comparative and Applied Criminal Justice 39, no. 4 (2015): 353–69; John Horgan, Neil Shortland, Suzzette Abbasciano, and Shaun Walsh, “Actions Speak Louder Than Words: A Behavioral Analysis of 183 Individuals Convicted for Terrorist Offenses in the United States from 1995 to 2012,” Journal of Forensic Sciences 61, no. 5 (2016): 1228–37; Katarzyna Jasko, Gary LaFree, and Arie Kruglanski, “Quest for Significance and Violent Extremism: The Case of Domestic Radicalization,” Political Psychology 38, no. 5 (2017): 815–31; Sarah Knight, David Keatley, and Katie Woodward, “Comparing the Different Behavioral Outcomes of Extremism: A Comparison of Violent and Non-Violent Extremists, Acting Alone or as Part of a Group,” Studies in Conflict & Terrorism 45, no. 8 (2022): 682–703; Gary LaFree, Michael A. Jensen, Patrick A. James, and Aaron Safer-Lichtenstein, “Correlates of Violent Political Extremism in the United States,” Criminology 56, no. 2 (2018): 233–68.

25 Scrivens et al., “Comparing the Online Posting Behaviors of Violent and Non-Violent Right-Wing Extremists.”

26 Thomas J. Holt, Joshua D. Freilich, Steven M. Chermak, and Gary LaFree, “Examining the Utility of Social Control and Social Learning in the Radicalization of Violent and Nonviolent Extremists,” Dynamics of Asymmetric Conflict 11, no. 3 (2018): 125–48.

27 Michael Wolfowicz, Simon Perry, Badi Hasisi, and David Weisburd, “Faces of Radicalism: Differentiating Between Violent and Non-Violent Radicals by Their Social Media Profiles,” Computers in Human Behavior. Ahead of Print.

28 Ryan Scrivens, Steven M. Chermak, Joshua D. Freilich, Thomas W. Wojciechowski, and Richard Frank, Detecting Extremists Online: Examining Online Posting Behaviors of Violent and Non-Violent Right-Wing Extremists (Washington, DC: RESOLVE Network Policy Note, 2021).

29 Scrivens et al., “Comparing the Online Posting Behaviors of Violent and Non-Violent Right-Wing Extremists.”

30 Davies et al., “A Longitudinal Comparison of Violent and Non-Violent Right-Wing Extremist Identities Online.”

31 Scrivens et al., “Differentiating Online Posting Behaviors of Violent and Non-Violent Right-Wing Extremists.”

32 Scrivens et al., “Comparing Online Posting Typologies among Violent and Nonviolent Right-Wing Extremists.”

33 Scrivens, “Examining Online Indicators of Extremism among Violent and Non-Violent Right-Wing Extremists.”

34 Scrivens et al., “Comparing the Online Posting Behaviors of Violent and Non-Violent Right-Wing Extremists.”

35 Davies et al., “A Longitudinal Comparison of Violent and Non-Violent Right-Wing Extremist Identities Online”; Scrivens et al., “Comparing the Online Posting Behaviors of Violent and Non-Violent Right-Wing Extremists”; Scrivens et al., “Differentiating Online Posting Behaviors of Violent and Non-Violent Right-Wing Extremists”; Scrivens et al., “Comparing Online Posting Typologies among Violent and Nonviolent Right-Wing Extremists”; Scrivens, “Examining Online Indicators of Extremism among Violent and Non-Violent Right-Wing Extremists.”

36 Garth Davies, Edith Wu, and Richard Frank, “A Witch’s Brew of Grievances: The Potential Effects of COVID-19 on Radicalization to Violent Extremism,” Studies in Conflict & Terrorism 46, no. 11 (2023): 2327–50; Leo Figea, Lisa Kaati, and Ryan Scrivens, “Measuring Online Affects in a White Supremacy Forum,” Proceedings of the 2016 IEEE International Conference on Intelligence and Security Informatics, Tucson, Arizona, USA; Philippa Levey and Martin Bouchard, “The Emergence of Violent Narratives in the Life-Course Trajectories of Online Forum Participants,” Journal of Qualitative Criminal Justice and Criminology 7 (2019): 95–121; Logan Macnair and Richard Frank, “Changes and Stabilities in the Language of Islamic State Magazines: A Sentiment Analysis,” Dynamics of Asymmetric Conflict 11 (2018): 109–20; Andrew J. Park, Brian Beck, Darrick Fletche, Patrick Lam, and Herbert H. Tsang, “Temporal Analysis of Radical Dark Web Forum Users,” Proceedings of the 2016 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining, San Francisco, CA, USA; Matteo Vergani and Ana-Maria Bliuc, “The Evolution of the ISIS’ Language: A Quantitative Analysis of the Language of the First Year of Dabiq Magazine,” Sicurezza, Terrorismo e Società 2 (2015): 7–20; Scrivens et al., “Measuring the Evolution of Radical Right-Wing Posting Behaviors Online.”

37 Swati Agarwal and Ashish Sureka, “Using KNN and SVM Based One-Class Classifier for Detecting Online Radicalization on Twitter,” Proceedings of the International Conference on Distributed Computing and Internet Technology, Bhubaneswar, India; Adam Bermingham, Maura Conway, Lisa McInerney, Neil O’Hare, and Alan F. Smeaton, “Combining Social Network Analysis and Sentiment Analysis to Explore the Potential for Online Radicalisation,” Proceedings of the 2009 International Conference on Advances in Social Networks Analysis and Mining, Athens, Greece; Hsinchun Chen, “Sentiment and Affect Analysis of Dark Web Forums: Measuring Radicalization on the Internet,” Proceedings of the 2008 IEEE International Conference on Intelligence and Security Informatics, Taipei, Taiwan; Emilio Ferrara, “Contagion Dynamics of Extremist Propaganda in Social Networks,” Information Sciences 418–419 (2017): 1–12; Emilio Ferrara, Wen-Qiang Wang, Onur Varol, Alessandro Flammini, and Aram Galstyan, “Predicting Online Extremism, Content Adopters, and Interaction Reciprocity,” Proceedings of the International Conference on Social Informatics, Berlin, Germany; Ted Grover and Gloria Mark, “Detecting Potential Warning Behaviors of Ideological Radicalization in an Alt-Right Subreddit,” Proceedings of the Thirteenth International AAAI Conference on Web and Social Media, Munich, Germany, 2019; Benjamin W. K. Hung, Anura P. Jayasumana, and Vidarshana W. Bandara, “Detecting Radicalization Trajectories Using Graph Pattern Matching Algorithms,” Proceedings of the 2016 IEEE International Conference on Intelligence and Security Informatics, Tucson, Arizona, USA; Benjamin W. K. Hung, Anura P. Jayasumana, and Vidarshana W. Bandara, “Pattern Matching Trajectories for Investigative Graph Searches,” Proceedings of the 2016 IEEE International Conference on Data Science and Advanced Analytics, Montreal, Canada.

38 Bliuc et al., “Collective Identity Changes in Far-Right Online Communities.”

39 Isabelle van der Vegt, Maximilian Mozes, Paul Gill, and Bennett Kleinberg, “Online Influence, Offline Violence: Language use on YouTube Surrounding the ‘Unite the Right’ Rally,” Journal of Computational Social Science 4(2021): 333–54.

40 Burnap and Williams, “Cyber Hate Speech on Twitter”; Markus Kaakinen, Atte Oksanen and Pekka Räsänen, “Did the Risk of Exposure to Online Hate Increase After the November 2015 Paris Attacks? A Group Relations Approach,” Computers in Human Behavior 78 (2018): 90–7; Matthew L. Williams and Pete Burnap, “Cyberhate on Social Media in the Aftermath of Woolwich: A Case Study in Computational Criminology and Big Data,” British Journal of Criminology 56, no. 2 (2015): 211–38.

41 Bethany Leap and Michael H. Becker, “The Not-So-Silent “Majority”: An Automated Content Analysis of Anti-Government Online Communities,” Perspectives on Terrorism 17, no. 1 (2023):103–122.

42 Stephen M. Croucher, Thao Nguyen, and Diyako Rahmani, “Prejudice Toward Asian Americans in the COVID-19 Pandemic: The Effects of Social Media use in the United States,” Frontiers in Communication 5 (2020); Matteo Vergani, Alfonso Arranz, Ryan Scrivens, and Liliana Orellana, “Hate Speech in a Telegram Conspiracy Channel During the First Year of the COVID-19 Pandemic,” Social Media + Society 8, no. 4(2022): 1–14.

43 See Scrivens et al., “Right-Wing Extremists’ Use of the Internet.”

44 Karsten Müller and Carlo Schwarz, “From Hashtag to Hate Crime: Twitter and Anti-Minority Sentiment,” available at SSRN: http://dx.doi.org/10.2139/ssrn.3149103; Alexandra A. Siegel, Evgenii Nikitin, Pablo Barberá, Joanna Sterling, Bethany Pullen, Richard Bonneau, Jonathan Nagler and Joshua A. Tucker, “Trumping Hate on Twitter? Online Hate Speech in the 2016 U.S. Election Campaign and its Aftermath,” Quarterly Journal of Political Science 16, no. 1 (2021): 71–104.

45 J. M. Berger, The Alt-Right Twitter Census: Defining and Describing the Audience for Alt-Right Content on Twitter (Dublin, Ireland: VOX-Pol Network of Excellence, 2018); Bharath Ganesh, “Weaponizing White Thymos: Flows of Rage in the Online Audiences of the Alt-Right,” Cultural Studies 34, no. 6(2020): 892–924.

46 Papasavva et al., “Raiders of the Lost Kek”; Finkelstein et al., “A Quantitative Approach to Understanding Online Antisemitism.”

47 Ryan Scrivens, George W. Burruss, Thomas J. Holt, Steven M. Chermak, Joshua D. Freilich, and Richard Frank, “Triggered by Defeat or Victory? Assessing the Impact of Presidential Election Results on Extreme Right-Wing Mobilization Online,” Deviant Behavior 42, no. 5 (2021): 630–45.

48 But a few examples include: Irena Pletikosa Cvijikj and Florian Michahelles, “Online Engagement Factors on Facebook Brand Pages,” Social Network Analysis and Mining 3 (2013): 843–61; Lei Guo, Enhua Tan, Songqing Chen, Xiaodong Zhang, and Yihong (Eric) Zhao, “Analyzing Patterns of User Content Generation in Online Social Networks,” in Proceedings of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining; Mike Thelwall, Kevan Buckley, and Georgios Paltoglou, “Sentiment in Twitter Events,” Journal of the American Society for Information Science and Technology 62, no. 2 (2011): 406–18; Qiu Fang Ying, Dah Ming Chiu, Srinivasan Venkatramanan, and Xiaopeng Zhang, “User Modeling and Usage Profiling Based on Temporal Posting Behavior in OSNs,” Online Social Networks and Media 8 (2018): 32–41.

49 Three notable exceptions include: Hutchinson et al., “Mobilizing Extremism Online”; Ayse Deniz Lokmanoglu, Carol K. Winkler, Monerah Al Mahmoud, Kayla McMinimy, and Katherine Kountz, “Textual Messaging of ISIS’s al-Naba and the Context Drivers that Correspond to Strategic Changes,” Studies in Conflict & Terrorism. Ahead of Print; Scrivens and Amarasingam, “Haters Gonna “Like.”

50 Hutchinson et al., “Mobilizing Extremism Online”; Scrivens and Amarasingam, “Haters Gonna “Like.”

51 Scrivens and Amarasingam, “Haters Gonna “Like.”

52 Brian Butler, “Membership Size, Communication Activity, and Sustainability: A Resource-Based Model of Online Social Structures,” Information Systems Research 12, no. 4 (2001): 346–62; Gabriel Weimann, The Influentials: People Who Influence People (Albany: State University of New York Press, 1994).

53 Gaudette et al. “The Role of the Internet in Facilitating Violent Extremism”; Lance Y. Hunter, Glen Biglaiser, Ronald J. McGauvran, and Leann Collins, “The Effects of Social Media on Domestic Terrorism,” Behavioral Sciences of Terrorism and Political Aggression. Ahead of Print.

54 See Scrivens et al., “Comparing Online Posting Typologies among Violent and Nonviolent Right-Wing Extremists.”

55 Scrivens et al., “Comparing the Online Posting Behaviors of Violent and Non-Violent Right-Wing Extremists.”

56 Bliuc et al., “Collective Identity Changes in Far-Right Online Communities”; Pete Simi and Robert Futrell, American Swastika: Inside the White Power Movement’s Hidden Spaces of Hate (Second Edition) (Lanham, MD: Rowman and Littlefield Publishers, 2015).

57 Bradley Galloway and Ryan Scrivens, “The Hidden Face of Hate Groups Online: An Insider’s Perspective,” VOX-Pol Network of Excellence Blog, January 3, 2018. https://www.voxpol.eu/hidden-face-hate-groups-online-formers-perspective (accessed 17 January 2024).

58 See Conway et al., “Right-Wing Extremists’ Persistent Online Presence.”

59 See Scrivens et al., “Right-Wing Extremists’ Use of the Internet.”

60 See W. Chris Hale, “Extremism on the World Wide Web: A Research Review,” Criminal Justice Studies 25, no. 4 (2012): 343–56; see also Christopher J. Lennings, Krestina L. Amon, Heidi Brummert, and Nicholas J. Lennings, “Grooming for Terror: The Internet and Young People,” Psychiatry, Psychology and Law 17, no. 3 (2010): 424–37; Meghan Wong, Richard Frank, and Russell Allsup, “The Supremacy of Online White Supremacists: An Analysis of Online Discussions of White Supremacists,” Information and Communications Technology Law 24, no. 1 (2015): 41–73.

61 See Back, “Aryans Reading Adorno”; see also Lorraine Bowman-Grieve, “Exploring ‘Stormfront’: A Virtual Community of the Radical Right,” Studies in Conflict and Terrorism 32, no. 11 (2009): 989–1007; see also Willem De Koster and Dick Houtman, “‘Stormfront is Like a Second Home to Me’: On Virtual Community Formation by Right-Wing Extremists,” Information, Communication and Society 11, no. 8 (2008): 1155–1176.

62 See Futrell and Simi, “Free Spaces, Collective Identity, and the Persistence of U.S. White Power Activism;” see also Barbara Perry and Ryan Scrivens, “White Pride Worldwide: Constructing Global Identities Online,” in The Globalisation of Hate: Internationalising Hate Crime, ed. Jennifer Schweppe and Mark Walters (New York, NY: Oxford University Press, 2016), 65–78.

63 See Burris et al., “White Supremacist Networks on the Internet”; see also Phyllis B. Gerstenfeld, Diana R. Grant, and Chau-Pu Chiang, “Hate Online: A Content Analysis of Extremist Internet Sites,” Analyses of Social Issues and Public Policy 3, no. 1 (2003): 29–44.

64 See Jessie Daniels, Cyber Racism: White Supremacy Online and the New Attack on Civil Rights (Lanham, MD: Rowman and Littlefield Publishers, 2009); see also Priscilla M. Meddaugh and Jack Kay, “Hate Speech or ‘Reasonable Racism?’ The Other in Stormfront,” Journal of Mass Media Ethics 24, no. 4 (2009): 251–68.

65 Scrivens, “Exploring Radical Right-Wing Posting Behaviors Online.”

66 Bennett Kleinberg, Isabelle van der Vegt, and Paul Gill, “The Temporal Evolution of a Far‑Right Forum,” Journal of Computational Social Science 4(2021): 1–23.

67 Scrivens et al., “Triggered by Defeat or Victory?”

68 Scrivens et al., “Measuring the Evolution of Radical Right-Wing Posting Behaviors Online.”

69 Bliuc et al., “Collective Identity Changes in Far-Right Online Communities.”

70 Scrivens et al., “Comparing the Online Posting Behaviors of Violent and Non-Violent Right-Wing Extremists.”

71 Notable exceptions include Davies et al., “A Longitudinal Comparison of Violent and Non-Violent Right-Wing Extremist Identities Online”; Scrivens, “Examining Online Indicators of Extremism among Violent and Non-Violent Right-Wing Extremists”; Scrivens et al., “Comparing the Online Posting Behaviors of Violent and Non-Violent Right-Wing Extremists”; Scrivens et al., “Differentiating Online Posting Behaviors of Violent and Non-Violent Right-Wing Extremists.”

72 Davies et al., “A Longitudinal Comparison of Violent and Non-Violent Right-Wing Extremist Identities Online”; Scrivens et al., “Comparing the Online Posting Behaviors of Violent and Non-Violent Right-Wing Extremists”; Scrivens et al., “Differentiating Online Posting Behaviors of Violent and Non-Violent Right-Wing Extremists”; Scrivens, “Examining Online Indicators of Extremism among Violent and Non-Violent Right-Wing Extremists.”

73 For more information on the web-crawler, see Ryan Scrivens, Tiana Gaudette, Garth Davies, and Richard Frank, “Searching for Extremist Content Online Using The Dark Crawler and Sentiment Analysis,” in Methods of Criminology and Criminal Justice Research, ed. Mathieu Deflem and Derek M. D. Silva (Bingley, UK: Emerald Publishing, 2019), 179–94.

74 September 12, 2001 was simply the date that the sub-forum went live online. Based on our assessment of the first messages posted on the sub-forum, it appears that Stormfront Canada was not launched in response to the 9/11 terror attacks.

75 Former violent extremists are individuals who, at one time in their lives, subscribed to and/or perpetrated violence in the name of a particular extremist ideology and have since publicly and/or privately denounced such violence. In short, they no longer identify as adherents of a particular extremist ideology, nor are they affiliated with an extremist group or movement.

76 Data collection efforts followed the proper ethical procedures for conducting research involving human participants. Here the former extremist was informed that their participation in the study was entirely voluntary. They were also informed that they had the right to decline to answer questions or to end the interview/withdraw from the study at any time. In addition, the former was informed that they would not be identified by name in any publication, and that all data collected from the interview would be de-identified for the purpose of ensuring participant anonymity. One in-person interview was conducted with the former in June 2017 and was approximately 10 hours in length. The interview was audio-recorded and transcribed.

77 This study is not an indictment of the sub-forum itself. The sub-forum was selected because it was an online space in which the former extremist actively participated during their involvement in violent right-wing extremism, meaning that they were familiar with the users who posted there and could identify individuals who the former knew were violent or non-violent RWEs in the offline world.

78 See Davies et al., “A Longitudinal Comparison of Violent and Non-Violent Right-Wing Extremist Identities Online”; see also Scrivens et al., “Comparing the Online Posting Behaviors of Violent and Non-Violent Right-Wing Extremists.”

79 See J. M. Berger, Extremism (Cambridge, MA: The MIT Press, 2018).

80 See Conway et al., “Right-Wing Extremists’ Persistent Online Presence.”

81 Tore Bjørgo and Jacob Aasland Ravndal, “Extreme-Right Violence and Terrorism: Concepts, Patterns, and Responses,” The International Centre for Counter-Terrorism – The Hague 10(2019): p. 5.

82 The mean number of posts/day in the data was 22.320.

83 I acknowledge the selection bias in this regard, as the pool of messages for peak days was larger than the pool for non-peak days.
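
To illustrate how peak posting days can be operationalized, the following minimal Python sketch (an illustration only, not the study’s actual procedure or data) counts posts per calendar day and flags the most active days; the column name, toy timestamps, and top-1% threshold rule are all assumptions rather than the paper’s exact operationalization.

import pandas as pd

# Hypothetical input: one row per forum post with a timestamp column.
posts = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2005-03-01 10:15", "2005-03-01 11:02", "2005-03-01 18:40",
        "2005-03-02 09:05", "2005-03-03 14:22", "2005-03-03 14:50",
    ])
})

# Posts per calendar day and the overall daily mean (cf. the mean posts/day reported in note 82).
daily_counts = posts.set_index("timestamp").resample("D").size()
mean_per_day = daily_counts.mean()

# One possible peak-day rule (an assumption): treat the top 1% most active days
# as peak posting days; all remaining days are non-peak days.
threshold = daily_counts.quantile(0.99)
peak_days = daily_counts[daily_counts >= threshold].sort_values(ascending=False)

print(round(mean_per_day, 3))
print(peak_days)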

84 Scrivens, “Examining Online Indicators of Extremism among Violent and Non-Violent Right-Wing Extremists.”

85 Scrivens, “Examining Online Indicators of Extremism among Violent and Non-Violent Right-Wing Extremists”; Ryan Scrivens, Amanda Isabel Osuna, Steven M. Chermak, Michael A. Whitney, and Richard Frank, “Examining Online Indicators of Extremism in Violent Right-Wing Extremist Forums,” Studies in Conflict & Terrorism 46, no. 11(2023):2149–73.

86 The codebooks used by Scrivens in “Examining Online Indicators of Extremism among Violent and Non-Violent Right-Wing Extremists” and by Scrivens et al. in “Examining Online Indicators of Extremism in Violent Right-Wing Extremist Forums” included measures to assess personal grievances in users’ online postings. These measures were initially included in the current study but were ultimately excluded after the data were coded because very few messages were identified that included personal grievances – a finding that mirrors the abovementioned studies.

87 A 10% sub-sample is commonly used by terrorism and extremism researchers when assessing inter-rater reliability. A recent example is Steven Windisch, Michael K. Logan, and Gina Scott Ligon, “Headhunting Among Extremist Organizations: An Empirical Assessment of Talent Spotting,” Perspectives on Terrorism 12, no. 2 (2018): 44–62.

88 Patrick E. Shrout and Joseph L. Fleiss, “Intraclass Correlations: Uses in Assessing Rater Reliability,” Psychological Bulletin 86, no. 2 (1979): 420–8.
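
As a rough illustration of the reliability check described in notes 87 and 88, the sketch below computes a single-rater, absolute-agreement intraclass correlation, ICC(2,1), following the Shrout and Fleiss formulation; the two simulated coders, the binary codes, and the sub-sample of 50 posts are hypothetical stand-ins rather than the study’s data.

import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random-effects, absolute-agreement, single-rater
    intraclass correlation (Shrout and Fleiss 1979).
    `ratings` is an (n_items, k_raters) array of codes."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # one mean per coded post (target)
    col_means = x.mean(axis=0)   # one mean per coder (rater)

    # Two-way ANOVA sums of squares and mean squares.
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_error = ((x - grand) ** 2).sum() - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )

# Hypothetical 10% sub-sample: two coders' binary judgments on 50 posts,
# with a handful of simulated disagreements.
rng = np.random.default_rng(0)
coder_a = rng.integers(0, 2, size=50)
coder_b = coder_a.copy()
coder_b[:5] = 1 - coder_b[:5]
print(round(icc_2_1(np.column_stack([coder_a, coder_b])), 3))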

89 Southern Poverty Law Center, “Ideologies,” 19 November 2020, https://www.splcenter.org/fighting-hate/extremist-files/ideology (accessed 17 January 2024).

90 A select number of ideology codes were drawn from the SPLC ideologies list to make up the ideology codebook for the current study – that is, ideologies that applied to the data and did not overlap with other ideology categories outlined by the SPLC. For example, the SPLC’s ‘neo-Nazi’, ‘racist skinhead’, and ‘Ku Klux Klan’ categories were omitted from the current study because their descriptions were too similar to code in a meaningful way. Other ideologies from the SPLC’s list that were omitted from the codebook included Black separatist, Phineas Priesthood, hate music, and alt-right, given that they did not adequately represent the data under investigation.

91 Barbara Perry and Randy Blazak, “Places for Races: The White Supremacist Movement Imagines US Geography,” Journal of Hate Studies 8, no. 1 (2009): 29–51; Pete Simi, “Why Study White Supremacist Terror? A Research Note,” Deviant Behavior 31, no. 3 (2010): 251–73.

92 Lorraine Bowman-Grieve, “Exploring ‘Stormfront’: A Virtual Community of the Radical Right,” Studies in Conflict & Terrorism 32, no. 11 (2009): 989–1007; Daniels, Cyber Racism.

93 Michael Barkun, “Millenarian Aspects of ‘White Supremacist’ Movements,” Terrorism and Political Violence 1, no. 4 (1989): 409–34; Betty A. Dobratz and Stephanie L. Shanks-Meile, “White Power, White Pride!”: The White Separatist Movement in the United States (Woodbridge, CT: Twayne Pub, 1997); Jeffrey Kaplan, “Right Wing Violence in North America,” Terrorism and Political Violence 7, no. 1 (1995): 44–95.

94 Federal Bureau of Investigation, Homegrown Violent Extremist Mobilization Indicators 2019 Edition (Washington, DC: Office of the Director of National Intelligence, 2019).

95 See Bowman-Grieve, “Exploring ‘Stormfront’”; see also De Koster and Houtman, “‘Stormfront is Like a Second Home to Me.’”

96 See Conway et al., “Right-Wing Extremists’ Persistent Online Presence;” see also Daniels, Cyber Racism.

97 Barkun, “Millenarian Aspects of ‘White Supremacist’ Movements”; Dobratz and Shanks-Meile, “White Power, White Pride!”; Raphael S. Ezekiel, The Racist Mind: Portraits of American Neo-Nazis and Klansmen (New York: Viking Penguin, 1995); Kaplan, “Right Wing Violence in North America”; Jeffrey Kaplan, “Leaderless Resistance,” Terrorism and Political Violence 9, no. 3 (1997): 80–95.

98 Bowman-Grieve, “Exploring ‘Stormfront’”; Daniels, Cyber Racism; Holt et al., “Examining the Online Expression of Ideology Among Far-Right Extremist Forum Users”; Scrivens et al., “Measuring the Evolution of Radical Right-Wing Posting Behaviors Online”; Scrivens, “Exploring Radical Right-Wing Posting Behaviors Online”; Scrivens et al., “Examining Online Indicators of Extremism in Violent Right-Wing Extremist Forums.”

99 Jacob Davey, Mackenzie Hart, and Cécile Guerin, An Online Environmental Scan of Right-Wing Extremism in Canada (London: Institute for Strategic Dialogue, 2020).

100 Jacob Davey and Julia Ebner, The Fringe Insurgency: Connectivity, Convergence and Mainstreaming of the Extreme Right (London: Institute for Strategic Dialogue, 2017); Finkelstein et al., “A Quantitative Approach to Understanding Online Antisemitism.”

101 Holt et al., “Examining the Online Expression of Ideology Among Far-Right Extremist Forum Users.”

102 Barkun, “Millenarian Aspects of ‘White Supremacist’ Movements”; Kaplan, “Right Wing Violence in North America”; Simi, “Why Study White Supremacist Terror?”

103 Scrivens, “Examining Online Indicators of Extremism Among Violent and Non-Violent Right-Wing Extremists.”

104 Holt et al., “Examining the Online Expression of Ideology Among Far-Right Extremist Forum Users”; Scrivens et al., “Examining Online Indicators of Extremism in Violent Right-Wing Extremist Forums.”

105 Berger, The Alt-Right Twitter Census.

106 Hutchinson et al., “Mobilizing Extremism Online”; Scrivens and Amarasingam, “Haters Gonna “Like.”

107 Tiana Gaudette, Ryan Scrivens, Garth Davies, and Richard Frank, “Upvoting Extremism: Collective Identity Formation and the Extreme Right on Reddit,” New Media and Society 23, no 12 (2021): 3491–508.

108 Scrivens, “Examining Online Indicators of Extremism Among Violent and Non-Violent Right-Wing Extremists.”

109 See Scrivens et al., “Comparing Online Posting Typologies among Violent and Nonviolent Right-Wing Extremists.”

110 See Scrivens, “Examining Online Indicators of Extremism Among Violent and Non-Violent Right-Wing Extremists.”

111 See, for example, Barbara Perry and Ryan Scrivens, Right-Wing Extremism in Canada (Cham, Switzerland: Palgrave, 2019).

112 Ibid.

113 Bliuc et al., “Collective Identity Changes in Far-Right Online Communities”; Leap and Becker, “The Not-So-Silent “Majority”; Scrivens et al., “Triggered by Defeat or Victory?”; Scrivens and Amarasingam, “Haters Gonna “Like.”

114 Scrivens, “Examining Online Indicators of Extremism Among Violent and Non-Violent Right-Wing Extremists.”

115 Scrivens et al., “Examining Online Indicators of Extremism in Violent Right-Wing Extremist Forums.”

116 Scrivens, “Examining Online Indicators of Extremism Among Violent and Non-Violent Right-Wing Extremists”; Scrivens et al., “Examining Online Indicators of Extremism in Violent Right-Wing Extremist Forums.”

117 Scrivens, “Examining Online Indicators of Extremism Among Violent and Non-Violent Right-Wing Extremists.”

118 See Ibid.

119 Gerstenfeld et al., “Hate Online;” Perry and Scrivens, Right-Wing Extremism in Canada; Simi and Futrell, American Swastika.

120 Scrivens, “Examining Online Indicators of Extremism Among Violent and Non-Violent Right-Wing Extremists.”

121 Scrivens et al., “Comparing Online Posting Typologies among Violent and Nonviolent Right-Wing Extremists.”

122 For more on these limitations, see Scrivens et al., “Comparing the Online Posting Behaviors of Violent and Non-Violent Right-Wing Extremists”; see also Scrivens et al., “Comparing Online Posting Typologies among Violent and Nonviolent Right-Wing Extremists.”

123 Horgan et al., “Actions Speak Louder Than Words”; Jasko et al., “Quest for Significance and Violent Extremism”; Knight et al., “Comparing the Different Behavioral Outcomes of Extremism”; LaFree et al., “Correlates of Violent Political Extremism in the United States.”

Appendix.

Codebook

Sample group: ___________________ Post ID: ___________________

Expressed Ideologies

  1. Expresses anti-immigrant ideologies (e.g., criticizes high levels of immigration).

    1a. __ No __ Yes (binary 0 = no, 1 = yes)

  2. Expresses anti-Semitic ideologies (e.g., hostility and prejudice against Jewish people).

    2a. __ No __ Yes (binary 0 = no, 1 = yes)

  3. Expresses anti-Black ideologies (e.g., hostility and prejudice against Black people).

    3a. __ No __ Yes (binary 0 = no, 1 = yes)

  4. Expresses anti-LGBTQ ideologies (e.g., hostility and prejudice against LGBTQ people).

    4a. __ No __ Yes (binary 0 = no, 1 = yes)

  5. Expresses anti-Muslim ideologies (e.g., hostility and prejudice against Muslim people).

    5a. __ No __ Yes (binary 0 = no, 1 = yes)

  6. Expresses anti-government ideologies (e.g., hostility and prejudice against a government or administration in office).

    6a. __ No __ Yes (binary 0 = no, 1 = yes)

  7. Expresses Christian Identity ideologies (e.g., a radical interpretation of Christianity that promotes white supremacy).

    7a. __ No __ Yes (binary 0 = no, 1 = yes)

  8. Expresses male supremacy ideologies (e.g., advocating for the subjugation of women).

    8a. __ No __ Yes (binary 0 = no, 1 = yes)

  9. Expresses neo-folkish ideologies (e.g., advocating for paganism and Viking culture, which are cemented in ethnocentricity and outdated notions of gender).

    9a. __ No __ Yes (binary 0 = no, 1 = yes)

  10. Expresses conspiracy theories (e.g., beliefs that a covert but influential organization is responsible for a circumstance or event, which are steeped in white supremacy).

    10a. __ No __ Yes (binary 0 = no, 1 = yes)

Expressed Violent Extremist Mobilization Indicators

  1. Mentions end of life preparations (e.g., discusses making a will or statements related to ending life).

    11a. __ No __ Yes (binary 0 = no, 1 = yes)

  2. Mentions seeking help (asking for money) to travel abroad.

    12a. __ No __ Yes (binary 0 = no, 1 = yes)

  3. Mentions planning a trip abroad.

    13a. __ No __ Yes (binary 0 = no, 1 = yes)

  4. Appears to be seeking permission to engage in violence.

    14a. __ No __ Yes (binary 0 = no, 1 = yes)

  5. Appears to be seeking to recruit others to mobilize.

    15a. __ No __ Yes (binary 0 = no, 1 = yes)

  6. Appears to be asking how to purchase/how to obtain illegal material (e.g., explosive precursors).

    16a. __ No __ Yes (binary 0 = no, 1 = yes)

  7. Contains terrorist icons/flags/prominent figures/symbols/slogans (e.g., 14/88, HH [Heil Hitler], WPWW [White Pride World Wide]).

    17a. __ No __ Yes (binary 0 = no, 1 = yes)

  8. Expresses goodbyes.

    18a. __ No __ Yes (binary 0 = no, 1 = yes)

  9. Expresses acceptance of violence as a necessary means to achieve ideological goals.

    19a. __ No __ Yes (binary 0 = no, 1 = yes)

  10. Attempts to radicalize others/pushing others to action.

    20a. __ No __ Yes (binary 0 = no, 1 = yes)

  11. Appears to be involved in a group that promotes violence to rectify grievances.

    21a. __ No __ Yes (binary 0 = no, 1 = yes)

  12. Provides virtual simulations of an attack/assault.

    22a. __ No __ Yes (binary 0 = no, 1 = yes)

  13. Discusses behavioral change.

    23a. __ No __ Yes (binary 0 = no, 1 = yes)

  14. Uses linguistic expressions that reflect new sense of purpose.

    24a. __ No __ Yes (binary 0 = no, 1 = yes)

  15. Advocates/encourages violence.

    25a. __ No __ Yes (binary 0 = no, 1 = yes)

  16. Asks for information about specific targets.

    26a. __ No __ Yes (binary 0 = no, 1 = yes)

  17. Asks for technical expertise.

    27a. __ No __ Yes (binary 0 = no, 1 = yes)

  18. Contains a violent, ideologically motivated outburst.

    28a. __ No __ Yes (binary 0 = no, 1 = yes)

  19. Blames external factors for failure in school, career or relationships.

    29a. __ No __ Yes (binary 0 = no, 1 = yes)

  20. Displays an unstable mental state.

    30a. __ No __ Yes (binary 0 = no, 1 = yes)

  21. Discusses operational security and asks about ways to evade law enforcement.

    31a. __ No __ Yes (binary 0 = no, 1 = yes)

  22. Praises past successful/attempted attacks.

    32a. __ No __ Yes (binary 0 = no, 1 = yes)

  23. Inappropriate use of what an individual perceives as doctrine to manipulate others (e.g., criticizing parents; promoting reading material, musical choices, religious practices).

    33a. __ No __ Yes (binary 0 = no, 1 = yes)
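
For readers who wish to apply the codebook computationally, the following Python sketch (an illustration, not the authors’ coding instrument) shows one way to represent the binary items above as a data structure and to initialize a coding record for a single post; the abbreviated item labels, the sample group value, and the post identifier are hypothetical.

# Illustrative representation of the codebook: each item is a binary code (0 = no, 1 = yes).
IDEOLOGY_ITEMS = {
    "1a": "anti-immigrant",
    "2a": "anti-Semitic",
    "3a": "anti-Black",
    "4a": "anti-LGBTQ",
    "5a": "anti-Muslim",
    "6a": "anti-government",
    "7a": "Christian Identity",
    "8a": "male supremacy",
    "9a": "neo-folkish",
    "10a": "conspiracy theories",
}

MOBILIZATION_ITEMS = {
    "11a": "end of life preparations",
    "17a": "terrorist icons/flags/prominent figures/symbols/slogans",
    "19a": "acceptance of violence as a necessary means",
    "25a": "advocates/encourages violence",
    "32a": "praises past successful/attempted attacks",
    # The remaining indicators (12a-16a, 18a, 20a-24a, 26a-31a, 33a) follow the same pattern.
}

def blank_coding_sheet(sample_group, post_id):
    """Return an empty record with every codebook item initialized to 0 (= no)."""
    sheet = {"sample_group": sample_group, "post_id": post_id}
    for code in list(IDEOLOGY_ITEMS) + list(MOBILIZATION_ITEMS):
        sheet[code] = 0
    return sheet

# Hypothetical usage: a coder flags one ideology item for one post.
record = blank_coding_sheet("violent", "post_00017")
record["2a"] = 1   # post judged to express anti-Semitic ideology
print(record)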