Internet Histories
Digital Technology, Culture and Society
Volume 7, 2023 - Issue 4
Research Articles

The Wikipedia imaginaire: a new media history beyond Wikipedia.org (2001–2022)

Pages 333-353 | Received 21 Sep 2022, Accepted 06 Aug 2023, Published online: 12 Aug 2023

Abstract

This paper presents a media biography of Wikipedia’s data that focuses on the interpretative flexibility of Wikipedia and digital knowledge between the years 2001 and 2022. To do so, I not only follow a strand of media historians who argue that the imagination is an important component for understanding how media change, but I also argue that Wikipedia’s data has been incorporated, re-imagined, and repurposed by sociotechnical projects in ways that have often been side-lined despite acting as the boundary lines of what is considered digital knowledge. The paper combines Patrice Flichy’s longitudinal theory of technical development as an imaginaire and Frederik Lesage and Simone Natale’s historical approach of biographies of media with an analysis of the interpretative flexibility of new media. Through an eclectic corpus of project websites, news articles, press releases, and blogs, I demonstrate the unexpected ways the online encyclopedia has permeated digital culture over the past twenty years through projects like the Citizendium, Everipedia, Google Search and AI software. As a result of this analysis, I explain how this array of meanings and materials constitutes the Wikipedia imaginaire: a collective activity of sociotechnical development that is fundamental to understanding the ideological and utopian meaning of knowledge within digital culture.

Wikipedia is more consequential to the history of the web than its status as an encyclopedia. Its data is woven deep into the fabric of how we imagine the relationship between knowledge and digital culture. But what exactly does this mean? Consider the usual character arc of the site: it began as an underdog volunteer-based project in 2001 and, over the next ten to fifteen years, became one of the most popular websites on the Internet. Embedded within this story is the drama of how the encyclopedia and its attendant community transformed the meanings and practices associated with collaboration (Reagle, Citation2010), participation (Benkler, Citation2006; Bruns, Citation2008), autonomy (O’Neil, Citation2009), openness (Tkacz, Citation2014) and consensus (Jankowski, Citation2022). While it is important to understand how Wikipedia has been a harbinger, stand-in, and site of struggle over these meanings and their respective Wikipedian users, it is time to consider new questions. What is a new media history of Wikipedia when we examine it—not from its center—but from its heterogeneous couplings with other web projects? Which meanings surround Wikipedia when we move past Wikipedia’s status as a platform or a community? What do we learn about its utopian and ideological connections when it is not a Wikimedia project, but an imaginaire?

Leaving the confines of wikipedia.org is a labyrinthine venture. Unlike other social media, Wikipedia technically allows any user to download its entire database (Wikipedia, Citation2002) and legally permits “its contents to be modified and reproduced without seeking permission or remunerating the prior ‘authors’” (Tkacz, Citation2014, p. 4). This situation is further amplified by Wikidata, a separate Wikimedia project that scrapes Wikipedia’s metadata and provides it free to third parties such as Google. As such, the Wikipedian meaning of “free” is not limited to providing unpaid access to encyclopedic knowledge. Its content is designed to be reproduced and repurposed as data that exists outside the intentions of its contributors, and it has since been copied and redeployed for completely different epistemological purposes within projects that attach different, often contradictory, meanings. When viewed from the perspective of Wikipedia’s data, we can no longer limit ourselves to the scope of one project. It becomes the Conservapedia, the Citizendium, Everipedia, Pediapress, V for Wiki, Google, OpenAI, and many others. With this perspective in mind, I argue that this array of project-based meanings constitutes the Wikipedia imaginaire: a collective activity of sociotechnical development that is fundamental to understanding the ideological and utopian meaning of knowledge within digital culture at the beginning of the twenty-first century.

By pursuing this line of inquiry, I align myself with several historians who approach “media not as clearly defined objects, but as shifting practices, discourses, technical configurations, and cultures” (Park et al., Citation2011, p. xiii). In the context of histories about the web, there are several approaches to address this shiftiness. For example, a platform historiography accounts for changes among the multiple sides of a platform’s users/stakeholders and its multiple layers of technical materials (Helmond & van der Vlist, Citation2019), while another is to concentrate on media as the basis of myth and narrative, since “technical objects such as networks, […] are themselves narratives; they communicate something to us” (Bory, Citation2020, p. 4, emphasis original). These two approaches reflect the complementary approaches of writing history through the changes in the “life history of a medium” and writing history through the continuity of “narratives” (Lesage & Natale, Citation2019, p. 577, 579). Both are valuable methods, but Frederik Lesage and Simone Natale advocated for their synthesis in the form of producing biographies of media: a method that includes making “connections across otherwise heterogeneous media landscapes” while also examining narratives that are “entangled in relations of power and resistance” (p. 579, 581).

In this paper, I apply this approach of the biography of media to Wikipedia, but I do so with additional attention to how the cultural meaning of Wikipedia’s knowledge is shaped by other technical projects that use its data. I suggest that this process is akin to what historians have described as new media’s condition of newness, or of always being “renewed” (Balbi, Citation2015, p. 245; Peters, Citation2009, p. 23). This terminology attends to how change and continuity within media need to be balanced, so that it is clear that “technologies have longer histories and operate against a background of long lasting, but evolving structures, cultures, and materiality” (Driessens, Citation2023, p. 34).

One way of addressing the relationship between change and continuity is to speak of interpretative flexibility. Approached from the view that technology is socially constructed, this concept highlights how different social groups attach different meanings to the emergence of the same technical innovation (Pinch & Bijker, Citation1984). Within media history, this concept has since been operationalized to explain how new media stimulate “an array of different conceptions” and provide “contexts in which competing ideas of how a medium may develop are conceived and discussed” (Natale & Balbi, Citation2014, pp. 208–209), either through metaphors or desires. As Sally Wyatt explained, media metaphors can be analyzed to identify the variety of hopes and fears associated with a medium (2021, p. 410), as well as “reveal the political assumptions and aspirations of those who deploy them” (2004, p. 245). Somewhat similarly, interpretative flexibility may also be expressed in terms of how media can exist as devices that “mediate impossible desires” (Kluitenberg, Citation2011, p. 48): such as the desire to communicate with the divine, the spirit world, or the “Other;” to transcend space, absence, and time; or to provide abundance and deliverance (pp. 57–66). In combination, metaphors like the information superhighway or the village square are significant because they draw on the continuity of impossible desires that are attached to the Internet—as Gabriele Balbi noted (2015, p. 233)—such as the erasure of distance or a perfected form of direct democracy.

However, while metaphors and impossible desires are analytically useful for identifying the continuity of narratives across media, they also raise an analytical problem. “Metaphors,”—and I include impossible desires as well—“are available to all” (Wyatt, Citation2021, p. 410). The traces of interpretative flexibility therefore provide waypoints within a topography of meaning, but on their own, they do not necessarily describe how these desires are attached to and decoupled from media over time. To address how the meaning of media shifts throughout their life cycles, one can turn to different periodization schemes such as Ben Peters’ (Citation2009) five overlapping stages of media renewability (technical innovation, cultural innovation, legal regulation, economic distribution, and social mainstreaming) or Natale and Balbi’s (Citation2014) trinity of media and imagination (media prophecy, new media interpretative flexibility, and fantasies of obsolescence). For my purposes, I will rely on Patrice Flichy’s cyclical diagram of the technical imaginaire and its analytical focus on projects (2007, pp. 9–10).

An “imaginaire” is a collective vision “common to an entire profession or sector,” which may take the form of utopias and ideologies that “play a part in the creation of technical systems” (Flichy, Citation2007, p. 4). As such, this interplay of utopian (possible) and ideological (entrenched) meanings is symmetrically reflected in the status of the technical object as a “catch-all” and “boundary” object of experimentation or a “locked-in” technical system (p. 10). These different objects are also associated with a bifurcation of different stages of technical development. On one side are the utopian watershed projects, utopian projects that become models, and projects of phantasmagoric escape. On the other side are projects which mask, legitimize, and mobilize ideology (p. 10). Beyond the characteristics of utopia and ideology, Flichy’s model is also a diagram of temporality, with the long-term dimension of development being captured by the totality of utopian and ideological meanings attached to the materials enlisted for each technical object—in other words, the imaginaire. The short-term dimension of the imaginaire is denoted by each project, which “is the place in which a new technical device is formulated” (p. 3). It is through this accumulation and connection of projects that Flichy’s notion of the imaginaire is produced.

Based on this review of concepts I consider useful for conducting a biography of a medium, I have arrived at the following theoretical description. I adapt the concept of the imaginaire as a conceptual framework for analyzing short-term sociotechnical projects that produce an array of competing utopian and ideological interpretations as impossible desires that are attached to a set of (catch-all, model, boundary, locked-in) objects. In the case of my analysis, I start with Wikipedia’s tagline, which promises an online and free encyclopedia that anyone can edit. I then proceed to explore sociotechnical projects that compete with these promises by transforming the boundary object of Wikipedia’s data into projects that engage with impossible desires that make this data not useful, not online, not free, not for anyone, and not editable. The resulting distribution of utopian and ideological interpretations of Wikipedia is what I call the Wikipedia imaginaire, an array of utopian and ideological meanings attached to the sociotechnical projects for storing, processing, and circulating abstracted Wikipedia knowledge (Figure 1).

Figure 1. The Wikipedia imaginaire: an array of meanings associated with Wikipedia and the sociotechnical projects that have used its data.


Method and sources

Within media history, Carolyn Marvin’s approach to studying “textual communities” (1988, p. 12) has set a literary model for examining the relationship between media and the imagination. In particular, she did so by “examining magazines that mainly targeted expert readers” (Natale & Ballatore, Citation2020, p. 6), which provides insight into how a group of magazines produces a particular set of heavily circulated meanings about a new medium. In contrast, other scholars have begun with popular metaphors and observed how they are articulated through a range of different newspapers and magazines (Puschmann & Burgess, Citation2014). Regardless of whether the research object is the publication or the metaphor, these scholars have chosen popular textual sources to provide evidence of changes and continuities in popular meanings. My study borrows from this method but takes it in a different direction. Instead of a focus on popularity, I am interested in an array of eclectic meanings, which therefore requires eclectic sources. This follows what Mary Franklin-Brown argued about collecting sources for her study of medieval encyclopedias: it required “the exploitation of a most eclectic group of source texts, a model for anyone attempting to take full account of (rather than reduce) the eclecticism” of the topic being studied (2012, p. 20).

With this principle of eclecticism in mind, my choice of an eclectic group of projects emerged from a broader research project about Wikipedia. Between 2014 and 2022, I made notes concerning secondary sources, bookmarked project websites, and kept track of news articles about projects that re-articulated the meaning of Wikipedia. In October 2021, I revised the list of projects and sought out a variety of primary sources using Google Search. I kept track of my queries using myactivity.google.com, which recorded 70 separate keyword searches and visits to 60 search result links between 26 October and 5 November 2022. For projects and articles that were no longer available, I used the Internet Archive’s Wayback Machine (IAWM) and chose the snapshot that was closest to the time period I was describing. From these sources, I created an eclectic corpus of project websites, news articles from popular and niche publications, press releases, and blogs.
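For readers unfamiliar with how such snapshots are located, the following is a minimal sketch of retrieving the IAWM capture closest to a given date through the Internet Archive’s public availability API. The example URL and date are hypothetical, and the sketch illustrates the general technique rather than the exact procedure used to assemble this corpus.

```python
import requests

WAYBACK_API = "https://archive.org/wayback/available"


def closest_snapshot(url: str, timestamp: str) -> str | None:
    """Return the archived URL of the IAWM capture closest to `timestamp`
    (formatted YYYYMMDD), or None if the page was never archived."""
    response = requests.get(WAYBACK_API, params={"url": url, "timestamp": timestamp})
    response.raise_for_status()
    closest = response.json().get("archived_snapshots", {}).get("closest", {})
    return closest.get("url") if closest.get("available") else None


# Hypothetical example: the Pediapress site as it looked around mid-2008.
print(closest_snapshot("pediapress.com", "20080601"))
```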

This reliance on Google Search and the IAWM comes with specific concerns about the historicity of “digital born materials” (Brügger & Finnemann, Citation2013). For example, Google privileges “certain information sources over others” and does not offer a complete search of the entire web (Rogers, Citation2013, p. 109). An additional concern is that online news is stored according to changing archival practices, and therefore an old article becomes “a novel artefact of its archiving process” (Rogers, Citation2017, p. 164). In some cases, websites suffer from the web’s “uneven maintenance” and the IAWM can be used to refer to periodic “snapshots” or captures of archived websites (Rogers, Citation2017, p. 164, 162). However, the IAWM does not record “1:1” copies of these sites, as the system often excludes different media types (Brügger, Citation2009, p. 125) and does not render deprecated code. Additionally, the IAWM provides information about whether each capture was archived by humans or by bots (Alexa Crawls), and this distinction matters for historians who are assessing the provenance of their sources and the curatorial choices of the archive (Ben-David & Amram, Citation2018).

Considering these conditions of finding texts for analysis, my search for eclectic sources was not exhaustive and was shaped by Google Search’s ranking system and the IAWM’s archival techniques. Of the IAWM captures I used, six were automatically archived by Alexa Crawls, three by the Internet Archive, and one was archived using the “Save Page Now” extension. Finally, in the case of Wikipedia and the Conservapedia, each edit on these sites is added to an archive which records the date, the change, and the user who initiated the edit—although these users may be anonymous or pseudonymous. For these reasons, I reference Wikipedia and Conservapedia pages by using the archived URL that points to the specific version of the page, rather than the current version.

Wikipedia: from utopia to ideology

It is difficult to overstate how Wikipedia has transformed the popular imagination of encyclopedias. This is in part because when Wikipedia entered the scene, the Encyclopaedia Britannica had long served as the model of the genre, having “come to define what an encyclopedia is” (Yeo, Citation2001, p. 170). Two twentieth-century editors of the Britannica also took note of this expectation among readers when they explained that “[t]oday most people think of an encyclopaedia as a multivolume compendium of all available knowledge, complete with maps and a detailed index, as well as numerous adjuncts such as bibliographies, illustrations, lists of abbreviations […,] alphabetically arranged contents will have been written in their own language by many people and will have been edited by a highly skilled and scholarly staff” (Preece & Collison, Citation2016).

By 2005, Wikipedia’s experiments with creating a free encyclopedia had begun to take hold of the popular imagination as a project utopia, which is a “formalized schema of a technique” (Flichy, Citation2007, p. 9). For Wikipedia, this technique was largely built on the sociotechnical capacities of wiki software and attached to the desire of allowing anyone to edit an encyclopedia. This was clearly articulated when Wikipedia co-founder Jimmy Wales asked users to “[i]magine a world in which every single person on the planet is given free access to the sum of all human knowledge” (Miller, Citation2004)—they did. But as Christian Pentzold investigated, Wikipedia is more than a digital platform for editing encyclopedic content; Wikipedia is imagined by its editors as a “community,” a space where “membership is based on compliance” with “appropriate beliefs, values, common understandings and practices” (2011, p. 718).

For Nicholas Carr, a member of the Encyclopaedia Britannica’s editorial board of advisors, it was precisely because Wikipedia was based on the “wisdom of the crowd” of its community that he argued it could not be an encyclopedia (Carr, Citation2005, Citation2006). His criticism aligned with a similar grievance: that without “a centralized team of authoritative experts and editors,” Wikipedia was seen as “nothing more than an unusually unvarnished avatar of the marketplace of ideas” (Leitch, Citation2014, p. 59). In some ways, the basis of these critiques mirrored comments made by Ward Cunningham (the developer of the wiki software), danah boyd (a social media scholar) and Cory Doctorow (a technology journalist and author), each of whom argued that the value of Wikipedia came from not being an encyclopedia, but something altogether different (Reagle, Citation2019; Westerman, Citation2009, p. 151).

To ensure the longevity of this project utopia, Wikipedia became financially, legally, and institutionally supported by the Wikimedia Foundation (WMF) in 2003. While the relationship between the WMF and the Wikipedian community is sometimes antagonistic in terms of disagreements over the organizational structure of the foundation and its relationships to business interests (Kostakis, Citation2010), the foundation has provided the framework for legitimizing the authority of the encyclopedia by arranging significant partnerships with universities, museums, and libraries (Lampe et al., Citation2012; Wikipedia, Citation2021).

While the early years painted Wikipedia as a utopian alternative to the Britannica, the year 2011 brought a shift in the tenor of critique when it was recognized that 80% of Wikipedia’s contributors were men, and that this created the conditions for reinforcing heteronormative gender roles (Reagle & Rhue, Citation2011; Ford & Wajcman, Citation2017). In line with Flichy’s terminology, Wikipedia’s utopian project of inclusivity—when put into practice—had become a “mask ideology” for an exclusive form of digital knowledge created by White Western cismen. As the WMF attempted to address these concerns, Wikipedia continued to garner status as the de facto encyclopedia of the Internet, ranking between the sixth and thirteenth most popular website on the Internet (Alexa.com, Citation2012, Citation2021), with daily pageviews on English Wikipedia reaching a three-month average of 235,735,916 and a total of 42 million users (Toolforge.org, Citation2021).

As a result of two decades of this collective experiment, Yochai Benkler observed that Wikipedia was not only “as good and imperfect as any encyclopedia,” but that it had become “the basic knowledge utility of contemporary society” (2019). While it was once seen as a radical departure from the genre, Wikipedia is relatively mundane. This banality of Wikipedia today suggests that the medium has become locked-in and legitimized as a commonplace medium for communicating knowledge.

If I stopped here, this story of development—from an uncertain underdog to an institutionally reinforced organization—may present a straight arrow from Wales’s project utopia to a locked-in internet utility. But the reality is that in the blinding light of Wikipedia’s success, it is easy to forget the ways it has been challenged, since “every medium is tested in the early phases of its evolution for purposes that are later abandoned” (Balbi, Citation2015, p. 238). What those purposes were and why they have been forgotten do more than act as footnotes in Wikipedia’s history. They are consequential for understanding the technical development of digital knowledge and its cultural context.

The free encyclopedia: a “catch-all” sociotechnical object (1999–2002)

Encyclopedists have a long tradition of describing and subtitling their encyclopedias. During the eighteenth and nineteenth centuries, it was not uncommon for an encyclopedia to self-identify as universal, complete, general, and systematic (see Yeo, Citation2001). When Wikipedia launched, its co-founders Jimmy Wales and Larry Sanger followed this tradition when they described it as “the free encyclopedia.” In this case, their attachment to the concept of free had a very particular meaning, one that was inspired by Richard Stallman, founder of the GNU Project and a central figure of the Free Software movement of the 1990s (Reagle, Citation2010, p. 37). In 1999, Stallman expressed his impossible desire for the web: it had “the potential to develop into a universal encyclopedia covering all areas of knowledge, and a complete library of instructional courses” (Stallman, Citation1999). However, his call was urgent. Stallman wanted to “ensure that the web develops toward the best and most natural outcome, where it becomes a free encyclopedia,” before any alternative to a commercialized web of knowledge was rendered incomprehensible. He further explained that it could be decentralized, written by anyone, and composed of suitable topics. Technically, it would permit universal access, mirror sites, translations, redistributions, modifications—and of course—run on free software (Stallman, Citation1999). While this proposal morphed into the GNUPedia project in 2001, Stallman shifted its focus away from being an encyclopedia that same year because he wanted to allow Bomis’s Nupedia the space to thrive (Reagle, Citation2010, p. 38).

Both encyclopedias shared a commitment to the principle of free content through the GNU Free Documentation License, which enabled users to not only freely read the work, but also freely improve it (p. 4). Wikipedia, which ran in parallel with Nupedia before superseding it, demonstrated how the particulars of free access were not settled, since the site was “conceived by Wales as a possible commercial undertaking” (Reagle, Citation2019). This was supported by the fact that both cofounders—Jimmy Wales and Larry Sanger—were considering selling ads to pay the staff who maintained the site, a fact inscribed within the original domain name: wikipedia.com, not .org (Sanger, Citation2005). When Sanger publicized the possibility of advertising in 2002, Wikipedians vocally opposed him. It even prompted contributors to the Spanish Wikipedia to leave en masse (Reagle, Citation2019) and create a fork called the Enciclopedia Libre Universal (Tkacz, Citation2014). Indeed, a year after the fork began, Wikipedia moved to a .org domain and became officially organized by the newly founded non-profit Wikimedia Foundation, which “operated as a fundraising tool to sustain the infrastructure” (Morell, Citation2011, p. 329).

What is important about this brief sketch is that it makes clear that the idea of a free encyclopedia was a “catch-all” sociotechnical object that each project was working on, but from opposing economic perspectives focused on the public domain, open source, and commercial enterprise. This period also matches what Flichy described as the gestation stage of innovation, where the “projects conceived of here are widely diverse, often opposed, sometimes simply juxtaposed” (p. 9). This statement remains true for the next set of experiments, where instead of economic concerns, the catch-all object of the free encyclopedia was drawn into the model set by Wikipedia, and this model was then applied to the cultural politics of truth and neutrality.

The truthful Wikipedias: copying the model (2002–2011; 2014–2022)

Sanger left Bomis after the conflict of the Spanish fork in 2002 (Reagle, Citation2019), but he did not stop pursuing what he believed to be the ideal free encyclopedia. He fundamentally did not agree with the way that Wikipedia represented an “epistemic egalitarianism” in which “Truth” was placed “in the service of Equality” (Sanger, Citation2007). It was through this opposition that he extended his work with Nupedia and Wikipedia to legitimize the capacity of experts to create a reliable source of information. But such a task, to convey encyclopedic truth, was difficult.

The Citizendium launched in 2006 with Sanger at its helm. He described it as an “encyclopedia-like” project—a “compendium”—that invited experts to contribute as editors who would edit using their “real name” (Anderson, Citation2007; Sanger, Citation2006). Despite these sociotechnical differences with Wikipedia, the project was not cut from whole cloth. Not only did it use the same MediaWiki software as Wikipedia, follow the same license for content, and include the policy of neutrality, but it was also a fork of Wikipedia’s content database (Sanger, Citation2006). However, the site was partially “unforked” in 2007 when the community decided that maintaining the integrity of the site’s vision meant keeping only those articles that had been edited by Citizendium users (Lee, Citation2011). While the Citizendium continues to operate at the time of writing, signs of a decline in participation were identified on the site nearly a decade ago, around the same time that Sanger stopped his own contributions (Lee, Citation2011).

During the rise of the Citizendium, there was a growing concern within American conservative and Christian circles about Wikipedia’s approach to authority. Despite the claims and policies of neutrality, Wikipedia’s articles were seen by some users as biased toward liberal ideals. This concern inspired Andy Schlafly to create the Conservapedia, a project designed in 2006 for home-schooled children (Johnson, Citation2007). Like the Citizendium, it did not allow editors to remain anonymous and it requested that each username be in some way representative of the user’s “real name” (Conservapedia, Citation2009a, Citation2009b). However, the new encyclopedia did not follow the Citizendium’s strategy of creating a fork. Instead, it used the MediaWiki software as a base and developed its own policies and content. Another point of distinction with Wikipedia was the bureaucratic policy structure. In this case, this right-wing encyclopedia reimagined wiki governance as a set of simplified “commandments,” with the first rule being “[e]verything you post must be true and verifiable” (Conservapedia, Citation2006). The preference for a command-based structure of social organization was also reflected in the adjunctive guidelines page that outlined how users should approach collaboration; under “Teamwork,” the first bullet point of the guideline instructed users to “Let others boss you around” (Conservapedia, Citation2009a). Each of these features adds up to a distinct outlook on digital knowledge. It did not share the impossible desire of creating a single global community. Instead, it was designed to reinforce the premise that the world was fundamentally a divided one, a world where this split was represented by two encyclopedias, one that was for Christian Americans and one that was not.

Far from the epistemologically restrictive approaches of the Conservapedia and the Citizendium was Everipedia. Launched in 2014, this start-up contender to Wikipedia first began as an encyclopedia of everything and ended with its rebranding as IQ.wiki in 2022, when it pivoted to “a hub for crypto knowledge” (IQ.wiki, Citation2022). The impetus for this encyclopedia came from Sam Kazemian and Theodor Forselius, two students from UCLA who forked Wikipedia with the intent to redesign it to address the problems of “deletionism, poor mobile editing options, and a lost spirit of inclusiveness” (James, Citation2017). On the first concern, deletionists believed that maintaining the integrity of the encyclopedia as a credible source of information required conservative assessments of what counted as notable topics. The result was an active and conscious effort to delete articles that did not meet the criteria of notability (Kostakis, Citation2010). On the other hand, the inclusionist perspective proposed that the more articles that existed, the better the encyclopedia. It was this second approach to content production that manifested on Everipedia in the form of a less restrictive notability criterion: so long as a page had a citation, it could remain (James, Citation2017). In 2017, Larry Sanger returned to the digital encyclopedia scene when he joined Everipedia as its Chief Information Officer. On the topic of its competitor, he argued that this new system improved upon Wikipedia’s model because it created more space for neutrality and allowed people to “speak with their own biases in a completely neutral, technically managed network with no editorial policy” (Munster, Citation2019).

An additional feature that distanced Everipedia from Wikipedia was its redesign of the sociotechnical mechanisms for settling debates: a game-like system where approved edits are rewarded with points, or IQ (Rubin, Citation2017). This system of gamifying encyclopedic production was integrated into the EOS blockchain (Moghadam, Citation2017), where each edit action and its content were stored as part of a decentralized ledger shared across a network rather than a centralized set of servers. Under this new system, the IQ points were repurposed as a token, and these tokens were used during 12-hour voting windows to accept or deny new articles, edits, proposals, and rules to be added to the ledger (Everipedia, Citation2019, Citation2021). To make an edit, users offered an amount of their tokens, and if the edit was denied, their tokens were not returned (Rubin, Citation2017). Likewise, the accumulation of edits-turned-tokens granted users the ability to steer the shape of the encyclopedia. These features therefore introduced the idea that contributing to the encyclopedia was both a risk and a stake. This point was reinforced by the fact that these Everipedia tokens could also be traded on cryptocurrency exchanges (AIT News Desk, Citation2020). The result was that through a series of conversions, editors could exchange their edits for fiat money—but only in direct proportion to what the IQ token was worth on the market. The move to a blockchain infrastructure was therefore tied to the desire for connecting knowledge to wealth, positioning Everipedia as a technical system built on the ideals of decentralization and user ownership.
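To make the stake-and-vote mechanism described above more concrete, the following is a schematic sketch of its logic as reported in the cited sources: tokens are staked when an edit is proposed and are returned only if the edit survives the voting window. This is an illustrative toy in Python, not Everipedia’s actual EOS smart-contract code, and all names and values are hypothetical.

```python
from dataclasses import dataclass, field

VOTING_WINDOW_HOURS = 12  # length of the voting window described in the sources


@dataclass
class Ledger:
    """Toy stand-in for the shared ledger; not EOS smart-contract code."""
    balances: dict = field(default_factory=dict)   # editor -> IQ token balance
    history: list = field(default_factory=list)    # records of settled edits

    def propose_edit(self, editor: str, stake: int) -> dict:
        """Staking deducts tokens up front; they return only if the edit is accepted."""
        if self.balances.get(editor, 0) < stake:
            raise ValueError("insufficient IQ tokens to stake")
        self.balances[editor] -= stake
        return {"editor": editor, "stake": stake, "for": 0, "against": 0}

    def settle(self, proposal: dict) -> None:
        """Called once the voting window closes."""
        if proposal["for"] > proposal["against"]:
            self.balances[proposal["editor"]] += proposal["stake"]  # stake returned
            self.history.append((proposal["editor"], "edit accepted"))
        else:
            # A denied edit forfeits the staked tokens.
            self.history.append((proposal["editor"], "edit rejected, stake forfeited"))


# Hypothetical usage: an editor stakes 10 IQ on an edit that is voted down.
ledger = Ledger(balances={"editor_a": 50})
edit = ledger.propose_edit("editor_a", stake=10)
edit["against"] = 3
ledger.settle(edit)
print(ledger.balances["editor_a"])  # 40: the stake was not returned
```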

However, since Everipedia was a fork of Wikipedia, this form of financialization raised important questions. As Christopher Cox (Citation2019) explained, it “commodifies Wikipedia pages and the process of their creation,” including the volunteered labour of Wikipedians. To be clear, this was not seen as a bug by Everipedia’s designers. In a whitepaper describing its peer-to-peer encyclopedia network, the cofounders announced that “one clear shortcoming Wikipedia has demonstrated is its inability to capture any of the monetary and intrinsic value of content that its platform and community has created” (Kazemian, Iyer, Moore, Forselius, & Sanger, Citation2018). Everipedia therefore imagined that its contribution to the world’s knowledge was in monetizing the free labour of Wikipedians. In doing so, they also engaged with the impossible desires of the market: that differences of opinion over knowledge could be, and should be, settled by pricing mechanisms that only function through monetary accumulation.

In terms of Flichy’s terminology, it was clear that Wikipedia’s experimentation with the catch-all object of a free encyclopedia had been a success: it was not only the technical model for other encyclopedias, but it had also become the center of the free encyclopedia project. And because Wikipedia was designed as a GNU-licensed wiki, copycat projects could readily draw on its resources. This condition was clearly understood by Flichy when he argued that the project utopia is easier to produce on the web because “software can be duplicated” and so the iteration of a prototype is, relatively speaking, “one easy step from design to use” (2007, p. 10).

The presence of these three project utopias also hardened the importance of a select set of meanings about Wikipedia. From the perspective of the Citizendium, Wikipedia’s commitment to epistemological neutrality needed to be strengthened by relying even more on the authority of experts. For the Conservapedia, Wikipedia was ontologically incapable of being neutral. In the worldview that underpinned this project, Wikipedia reflected a liberal (and therefore, sinful) representation of the truth. From a different angle, Everipedia imagined that Wikipedia was still limited in its ability to provide the freedom to produce and circulate knowledge. The remedy its creators proposed was to employ the supposedly neutral mechanisms of the market to reduce the deleterious effects of Wikipedian production. As such, these points of contrast and conflict between these projects and Wikipedia made specific conditions of online knowledge culturally significant. These differences emphasized how desires for truth, faith, and money could be integrated as design philosophies for encyclopedic gate-keeping, commandments, and markets.

The opaque Wikipedias: mask ideology (2006—)

Everipedia’s reading of Wikipedia’s non-profit status as a “shortcoming” reveals just how far its vision had diverged from Stallman’s original dream of a free encyclopedia. However, despite Wikipedia’s ability to create firm barriers to commodification within its own site, Everipedia is a case study in how Wikipedia’s license permits external activities that are not in keeping with Wikipedian practices. This techno-legal status of Wikipedia’s data also complicates one of the early explanations of Wikipedia’s emergence. In his 2010 book, Joseph Reagle described Wikipedia as “[b]orn almost as a happy accident, growing far beyond anyone’s expectations” (p. 173). Such a reading makes sense if one stays within the confines of Wikipedia. But if one examines how Wikipedia became integral to the business plans of Silicon Valley corporations, its success begins to look less accidental and more strategic. In these cases, Wikipedia’s design principles of transparency were flipped into opaque black boxes, systems “whose workings are mysterious” and shrouded in corporate control (Pasquale, Citation2015, p. 3).

For example, Google began ranking Wikipedia high in its search results by at least 2006, a decision that was clearly to their mutual benefit, since Wikipedia received more traffic (and potential editors) and Google could rely on it to provide useful and comprehensive results for its own users (Van Dijck, Citation2013, p. 151). Google recognized that Wikipedians had “worked out norms and processes for neutralizing controversial content and contentious topics, a quality that aids Google’s search engine value” (p. 151), a value also demonstrated on YouTube (which is owned by Google/Alphabet), where videos posted by different news organizations have been contextualized with Wikipedia links (Matsakis, Citation2018).

Another kind of opaque transformation of Wikipedia’s data was the way that the free encyclopedia was used for search engine tags (Langlois & Elmer, Citation2009, p. 775), which in turn enabled “some of [Google’s] lower-level artificial intelligence systems,” aided its search rankings, and was repurposed into Google’s “knowledge cards” that are presented at the top of search results (McMahon, Johnson & Hecht, Citation2017, p. 2). These tactics were a result of the creation of Wikidata, a Wikimedia project partially funded by Google to translate information from Wikipedia articles into a semantic database of metadata (Perez, Citation2012).

When McMahon et al. (Citation2017) analyzed the impact of Google’s use of Wikidata on Wikipedian traffic, they found that Google’s knowledge cards served the needs of users searching for information, but did so at the expense of expanding Wikipedia’s own content. Since Wikipedia-based knowledge cards did not provide the ability to edit the content directly, the researchers concluded that there is less of a chance for searchers to become Wikipedian editors—and that this therefore exacerbated lower levels of contribution (p. 2). Furthermore, Heather Ford (Citation2022) argued that this process of translating facts into data also leads to “diminished opportunities for debate and contestation,” and that the public nature of these extracted facts is further subjected to the rules of the proprietary platform. While this problem has been identified, Wikipedia’s license allows the situation to persist—there are no provisions to limit commercial use of its content. In what can be seen as a means of recouping the vampiric loss in traffic to Wikipedia (and therefore potential editors), Wikipedia’s parent foundation announced the creation of Wikimedia Enterprise, a paid service that allows corporations like Alphabet and Amazon to pay for custom high-volume access to Wikimedia data (Wikimedia Foundation, Citation2021). This situation points to a different moment in the imaginaire of technical development. Here, Wikipedia’s data exists as a boundary object that is used to mask a capitalist ideology. In applying Flichy’s theory, the digital labour of Wikipedians was “readily concealed in order to promote the new technique” (Flichy, Citation2007, p. 11).

Beyond Google, there exist other opaque Wikipedias, encountered in the voices of Amazon’s Alexa and Apple’s Siri, which rely heavily on an assortment of accessible content, including Wikipedia, to dramatically increase the value and utility of these consumer products (Withers, Citation2018). This use of Wikipedia’s data draws heavily from the computer science extractivist vision of the encyclopedia. As Hill and Shaw noted, “[p]erhaps the most widespread and pervasive form of Wikipedia research is not ‘about’ Wikipedia at all, but research that uses Wikipedia as a convenient dataset to study something else” (2019, p. 3). They identified that a large portion of these studies engage in Natural Language Processing of Wikipedia because “it encompasses an enormous, multilingual dataset written and categorized by humans” (p. 4).

This type of research has also found its way into the latest attempts to create software that is trained on massive datasets by statistically analyzing the probability of one word following another, identifying patterns, and then generating texts that read as human-like responses to prompts. This is how OpenAI (a for-profit company partnered with Microsoft) created its GPT-3 model, which was trained, in part, on data extracted from Wikipedia (Floridi & Chiriatti, Citation2020). In 2020, Microsoft announced that it was exclusively licensing GPT-3, and in 2021 that it would be using it for cloud services and app development (Langston, Citation2021; Scott, Citation2020). It is this combination of computer science and corporate strategies that has most significantly reworked the meaning of Wikipedia. It is not a utopia of open participation. It is desired as a resource to be mined, exploited, and enclosed as a corporate commodity.
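As an illustration of the word-following statistics referred to above, the following Python toy counts which words follow which in a text and then samples likely continuations. It is a deliberately minimal sketch built on a hypothetical two-sentence corpus; models such as GPT-3 use neural networks trained over vastly larger contexts, so this is not OpenAI’s method, only the underlying intuition.

```python
import random
from collections import Counter, defaultdict


def train_bigrams(corpus: str) -> dict:
    """Count, for every word, how often each other word follows it."""
    words = corpus.lower().split()
    follows = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        follows[current][nxt] += 1
    return follows


def generate(follows: dict, seed: str, length: int = 10) -> str:
    """Sample a continuation, weighting each next word by how often it was observed."""
    word, output = seed, [seed]
    for _ in range(length):
        candidates = follows.get(word)
        if not candidates:
            break
        word = random.choices(list(candidates), weights=list(candidates.values()))[0]
        output.append(word)
    return " ".join(output)


# Hypothetical toy corpus standing in for a Wikipedia text dump.
model = train_bigrams("the free encyclopedia that anyone can edit is the free encyclopedia")
print(generate(model, "the"))
```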

The aesthetic Wikipedias: project and phantasmagoric utopias (2005; 2014–)

One of the initial purposes that the cofounders of Everipedia had in mind when they forked Wikipedia was to rebuild and improve on the 2000s-era design of MediaWiki. Of course, these designers were not the first to think that one of the most popular websites on the web required a facelift. As a co-founder of Wikiwand, Lior Grossman lamented that his team “found the Wikipedia interface cluttered, hard to read (large blocks of small text), hard to navigate, and lacking in terms of usability” (Shu, Citation2014). In response, they built a series of browser extensions and a web application that allowed users to display Wikipedian content with more white space and greater attention to typography. Wikimedia was also aware of these issues and worked on its own reassessment of Wikipedia’s interface during this same period. This included creating a new Visual Editor, making adaptations to suit mobile devices, as well as addressing the site’s typographic issues (Protalinski, Citation2013; Walling, Citation2014).

However, perhaps the most extensive attempt to beautify Wikipedia through design was also developed in 2014. The app Das Referenz (later, V for Wiki) is a version of Wikipedia that is meant for designers to fawn over. Indeed, the famous typeface designer Erik Spiekermann tweeted that the app was the “best typography on the small screen yet” (Spiekermann, Citation2016). While legibility and user experience are part of what designer Frank Rausch envisioned, he also sought to position Wikipedia as part of a long tradition of cultural objects—encyclopedias, of course—that “have always reflected culture and style” (Jockin, Citation2016). From this perspective, encyclopedic knowledge becomes entangled in discussions about how culture is represented through objects of design. However, V for Wiki’s app-based approach positioned the encyclopedia as a cultural object to be consumed and read. This is confirmed by the support page, which explained that if users see a content error on V for Wiki, they can fix it by making an edit on “the Wikipedia website,” and the “change will be visible a few hours later” on the app (V for Wiki, Citation2021). This aspect demonstrates that the app itself did not support editing the encyclopedia directly—and therefore imagined the value of Wikipedia not as a participatory community, but as a reading commodity. This vision of Wikipedia as the encyclopedia that anyone can read—rather than edit—goes even further, with actual attempts to recreate the aesthetic experience of a printed encyclopedia.

In 2005, a new page called Help:Printing (Wikipedia, Citation2005a) was created on Wikipedia. It serves as an early indication that there was a desire to print Wikipedia articles within the first few years of its creation. In 2006, this desire was expanded by a German press called Pediapress which began offering printed books composed of Wikipedia articles (Pediapress, Citation2006). On 13 December 2007, Wikimedia announced that they would partner with Pediapress to create on-demand printed and bound copies of Wikipedia content (Wikimedia Foundation, Citation2007). Pediapress stated that the reason for making offline versions of the online encyclopedia was to make up for the limitations of a digital encyclopedia which included the fact that “more than two thirds of the worlds [sic] population has no access to the internet” and that “when it comes to reading longer texts many [people] still prefer to read books” (Pediapress, Citation2008). By early 2009, the capability to print English Wikipedia articles was extended to the German, French, Polish, Dutch, Portuguese, Spanish, and Simple English Wikipedias (Moeller, Citation2009). In this project, the promise of universal access afforded by the internet was drastically curtailed. It became clear that while digital media increased the ability to participate in encyclopedic production, participation was still limited by the geographies where Internet infrastructure existed and was affordable.

Beyond providing access to Wikipedian content through purchasable books, the idea of a printed encyclopedia had other meanings. Since 2007, the article “Wikipedia:Size in volumes” has used the image of a human standing next to a large shelf of books to represent the amount of content that Wikipedia has produced (Wikipedia, Citation2007). With Pediapress already set up to print copies of Wikipedia articles, it attempted to make this diagram a reality. Pediapress raised funds in 2014 through the crowd-funding website Indiegogo to print a single edition of every article (Dillet, Citation2014). Its campaign page asked, “can you imagine how large Wikipedia really is? We think that the best way to experience the size of Wikipedia is by transforming it into the physical medium of books,” with the plan of displaying the 1,000 volumes as a public exhibition at the 2014 Wikimania conference (Pediapress, Citation2014a). The campaign closed on 11 April 2014, reaching only $12,530 of its $50,000 goal. The project could not be completed (Pediapress, Citation2014b).

Similar to Pediapress’s Indiegogo campaign was Print Wikipedia, an art performance that played on the interactions between physical and digital objects. Like Pediapress, the interdisciplinary artist Michael Mandiberg thought that a printed version of the encyclopedia could elicit an understanding of Wikipedia as a whole. He argued that the weight and space of a series of physical books was a “cognitively useful” unit for measuring the experience of the accumulation of Wikipedia’s knowledge (Schuessler, Citation2015). However, the economic cost of printing 7,600 volumes created some unavoidable artistic constraints. Mandiberg therefore turned the potential of printing all the volumes into a way of communicating scale. He did this by automating the process of typesetting each volume and then uploading each volume to the self-publishing platform lulu.com. The performance piece itself was composed of a few printed volumes, a live stream of the upload process, and a Twitter account that announced each successful upload (Schuessler, Citation2015). What is interesting about both Pediapress and Mandiberg’s projects was that they were designed to provide access to the impossible experience of the totality of human knowledge rather than create a useful knowledge tool. And these were not the only attempts.

Numerous artists, programmers, and users have pushed Wikipedia experiences beyond finding information. Such examples render Wikipedia in ludic and exhibition terms, which represent what Flichy described as phantasmagoric utopias, an escape or the “refusal to face the technical reality” (p. 10). For example, “Hatnote” poetically converted live edits into music (LaPorte & Hashemi, Citation2013), while others have transformed Wikipedian links into a galactic ocean of nodes (Li, Citation2014; McCormick, Citation2016). Some projects have been more playful, such as creating quizzes (Baldwin, Citation2018), text adventures (Machkovech, Citation2017), races (“Wikipedia:Wikirace,” 2005b), or computer science contests to create an algorithm that compresses 100MB of the encyclopedia to less than 16MB (Hutter, Citation2017). Coming full circle, Wikipedia’s co-founder Jimmy Wales has joined in on having some fun with Wikipedia.

In late 2021, Wales auctioned off an NFT (non-fungible token) of the first edit to the Wikipedia server (Harrison, Citation2021). He described it as an “artistic concept” intended “to invite people to think about that moment” when Wikipedia coalesced as working code. However, Wikipedians pointed out that the Christie’s auction ran counter to the ethos of the encyclopedia; NFTs are based on individual—rather than collective—ownership (Harrison, Citation2021). Whether or not Wikipedians believed Wales was right to sell the edit as an NFT was beside the point. What this controversy made clear was that there has never been one way of imagining Wikipedia. There have long been different sociotechnical projects shaped by Wikipedia’s data, engaging in uncertain contests with one another, each desiring different forms and purposes for digital knowledge.

Conclusion

What this array of meanings presents is a unique understanding of how Wikipedia is woven into various features of digital culture. It demonstrates that while Wikipedia may exist as the current encyclopedic model, the Wikipedia imaginaire is a vast collection of projects that constantly renew the meaning of Wikipedia through contests over designing for knowledge, truth, education, and economy. Wikipedia has been embedded within the developments of AI and platform profiteering; envisioned as a massive physical object and as exploratory games; and operationalized to align with the logic of app-based commodities. At the same time, as much as these differences suggest radical change, they are also continuous with the long-standing desires for media to collapse space, generate wealth, form community, and provide the experience of the totality of human knowledge.

By shifting the historical focus from the Wikipedia we query and edit to Wikipedia’s data that was designed to flow, I have drawn attention to contradictory positions about how knowledge, media, and capitalism have been imagined to manifest within digital culture. As I have demonstrated, a history of a platform that begins and ends with a single URL will miss out on this expanse of cultural context, which not only tells us what has changed—but also what has stayed the same. I therefore encourage future research to adopt this biographical approach to examining histories of the Internet. And finally, when it comes to Wikipedia specifically, this porous history renders the mundane and commonplace act of looking something up an act we should not take for granted. It makes it clear that Wikipedia is more than just the Internet’s free encyclopedia.

Disclaimer

Portions of this paper are drawn from a published dissertation and a first draft of the manuscript was presented at ICA 2022.

Conflict of interest

The author was funded by the Wikimedia Foundation for a separate research project during the writing of this article.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Notes on contributors

Steve Jankowski

Steve Jankowski is a Lecturer in New Media and Digital Culture at the University of Amsterdam. He received his PhD in Communication and Culture from York University and Toronto Metropolitan University in Canada for his dissertation on Wikipedian consensus and the political design of encyclopedic media. He also holds an MA in Communication from the University of Ottawa and an Honours Bachelor of Design from the York/Sheridan Design Program. His research examines the intersections between digital culture, design and historical imaginaries of democracy and knowledge.

References