Research Article

One index, two publishers and the global research economy


ABSTRACT

The emergence of a global science system after the second world war was spurred by transformations in academic publishing and information science. Amidst Soviet-American technological rivalries, funding for science expanded rapidly. Elsevier and Pergamon internationalised journal publishing, whilst tools such as the Science Citation Index changed the way research was measured and valued. This paper traces the connections between the post-war expansion of academic research, new commercial publishing models, the management of research information and Cold War geopolitics. Today, the analysis and use of research metadata continues to revolutionise science communication. The monetisation of citation data has led to the creation of rival publishing platforms and citation infrastructures. The value of this data is amplified by digitisation, computing power and financial investment. Corporate ownership and commercial competition reinforce geographical and resource inequalities in a global research economy, marginalising non-Anglophone knowledge ecosystems as well as long-established scholar-led serials and institutional journals. The immediate future for academic publishing will be shaped by a growing divide between commercial and ‘community-owned’ open science infrastructures.

Introduction

In 1986, to celebrate her husband’s 65th birthday and the 40th anniversary of Pergamon Press, Elisabeth ‘Betty’ Maxwell invited 400 authors and editors to contribute reflections on the founding of their journals and book series, or their association with the Press. The encomiums flowed in. More than 300 sent glowing accounts of Robert Maxwell’s support and enthusiasm for their specialist area and his unstinting backing for their new journals. Across the whole gamut of the sciences and social sciences, from the History of European Ideas to the Annals of Occupational Hygiene, a veritable ‘Who’s Who’ of eminent senior academics offered lavish praise (Maxwell, Citation1988).

Five years later, on 6 November 1991, Robert Maxwell disappeared overnight from his luxury motor yacht, the Lady Ghislaine. Speculation about the cause of death focused on the huge debts facing his media and business empire, and Maxwell’s use of the Mirror’s pension fund to shore up Mirror shares. The scandal obscured the influence of Pergamon – along with its Dutch rival Elsevier – on the creation of an international science system after the second world war.

The history of the global research economy is entangled in Cold War geopolitics. Today’s science communication system traces its roots to the new publishing models pioneered by Elsevier and Pergamon, and the streams of data generated by the emergent world of ‘information science’ in the 1950s. Entrepreneurial publishers like Elsevier’s Johannes Klautz envisioned an international journal culture (Daling, Citation2006), whilst Eugene Garfield’s innovations transformed the evaluation and impact of research. As Pooley (Citation2022, p. 40) notes, by the 1960s the emergent field of bibliometrics ‘was already enmeshed in data capitalism’ (see also Sadowski, Citation2019). Many of these developments benefitted from superpower rivalries. Maxwell skilfully played off competing allied and Soviet interests in post-war Berlin (Preston, Citation2021), whilst Garfield funded his index by selling bibliometric tools to the USSR (Aronova, Citation2021). Collectively these developments reinforced the dominance of commercial academic publishing and of European and US journals, laying down the foundations of a geographically unequal global science system.

Scholars have been slow to recognise the impact of this corporate oligopoly (Larivière et al., Citation2015), compared to the long history of critical work on the university itself. Max Weber, in a speech at Munich University in 1918, was one of the first to draw attention to the ‘capitalist university’, highlighting the ‘extraordinarily wide gulf, externally and internally, between the chief of these large, capitalist university enterprises and the full professor of the old style’ (Weber, Citation1918 [1946], p. 131). Veblen (Citation1918) was similarly acerbic in ‘The higher learning in America: a memorandum on the conduct of universities by business men’. The sociological study of ‘academic capitalism’ began in the 1990s (Hackett, Citation1990; Slaughter & Leslie, Citation1997; Slaughter & Rhoades, Citation2004), yet this work focuses on the university rather than the publishing ‘industry’ per se (e.g. Geiger, Citation2004). Fyfe et al. (Citation2017, p. 17) suggest that there has been a ‘lack of detailed understanding among academics of the historical and economic forces at play in academic publishing’.

This paper traces the post-war emergence of a global research economy underpinned by ‘international’ journals, and the influential role of the first citation index in shaping the contemporary science communication system. At a moment of policy debate about the future role of publishers within this knowledge ecosystem, the paper explores alternative publishing models, and the challenge of building open source data infrastructures.

From Nazism to the Cold War

Until the late nineteenth century, academic publishing was primarily viewed as a service provided by university presses and scholarly societies to their members (Fyfe et al., Citation2017). In Victorian England, several popular science serials developed large readerships (Brock, Citation1980), but Nature was unusual in garnering both scholarly credibility and commercial success (Baldwin, Citation2015). The research economy first emerged in Germany, with the rise of the research university and of commercial publishers attuned to new research specialisms. Daling (Citation2006) describes how Ferdinand Springer, taking on the family business in 1906, set out a vision for six different types of scientific communication, from the research journal to monographs and encyclopaedias, all of which his company sought to commission and publish. By the end of the 1920s, Springer Verlag had developed an extensive network of editors, authors and readers. The model was highly successful, and many German scientific societies began outsourcing journal production and distribution: especially as German hyper-inflation undermined society finances. By the 1930s, a series of consolidations left just two dominant German publishers: Springer and Akademische Verlagsgesellschaft (Daling, Citation2006; Edelman, Citation2004).

During the 1930s the rise of Nazism turned the tables once again, undermining Germany’s dominance of international science. Springer struggled to survive, as did other publishers classified as ‘Jewish’. Many German scholars and journal editors left Germany, some bound for Amsterdam en route to the US. Seizing the opportunities offered by this inflow of émigré expertise, the Dutch publisher Elsevier recruited German-speaking editors and technical experts. Until this point it had been a general publisher, but under its new director Johannes Klautz, appointed in 1931, it began to publish texts on technology and medicine in German. Klautz’s global ambitions led him to acquire the English translation rights to two important German Organic Chemistry handbooks, and Elsevier began to specialise in publishing chemistry texts in English. After the second world war, Elsevier moved quickly to revive international scientific collaboration, again benefitting from the tough restrictions placed on German publishers (Brown, Citation1947). Responding to calls for avowedly ‘international’ science journals that would bring fragmented research communities together, Elsevier launched Biochimica et Biophysica Acta (BBA) in 1948, with an editor determined to revitalise the fortunes of European Biochemistry (Daling, Citation2023). Other journals quickly followed. As Daling notes, these ‘open-end subscription journals were international, specialized (but still with a wide scope), fast, and free of charge for authors’ (Citation2023, p. 8). Based on a subscription business model, rather than page charges, editors experimented with a range of genres and editorial styles, and journal formats only began to standardise in the 1960s.

Across the Atlantic, US policymakers were keenly aware that technological advances had been key to wartime military success. Asked by Franklin D. Roosevelt to develop a vision for US science policy after the war, the American inventor Vannevar Bush wrote ‘Science, the endless frontier’ (Citation1944). He presented basic research as the ‘pacemaker’ that underpinned scientific progress, making the case for a massive expansion in US science funding and the launch of the National Science Foundation.

Like Elsevier’s Johannes Klautz, Robert Maxwell made the most of business opportunities opened up by the Cold War. Born to a poor family in eastern Czechoslovakia, he escaped the Nazi occupation and joined the Czechoslovak Army in exile during World War II. He won a Military Cross for active service in the British Army, and for the rest of his life continued to use his military title: ‘Captain’. Based in Berlin after the war as a British military attaché, he could deploy his linguistic, diplomatic and entrepreneurial skills. It was later revealed that he had also worked as an MI6 agent, gathering intelligence on Russia (Preston, Citation2021). As the war came to an end, the allied powers were keen to profit from German scientific knowledge. Maxwell used his military contacts to obtain copies of secret Soviet documents about every important German industrial plant, helping the allies pre-empt a Soviet plan to strip scientific materials and remove them to the Soviet Union. During this time, Maxwell used his business and government contacts to help Springer restart journal and book distribution, doing deals and learning about the publishing industry.

Making the most of his Berlin contacts, and with secret support from MI6, in 1951 Maxwell paid £13,000 to buy the UK distribution rights for Springer Verlag publications: six science journals and two textbook series. When Springer later withdrew, Maxwell went into direct competition with Elsevier and Springer (Cox, Citation2002). By 1960 his new company, Pergamon Press, was distributing 59 ‘international’ scientific journals, growing subscriptions at 5–10% each year. Maxwell sought out influential scientists, especially those developing new subfields or doing interdisciplinary work, and encouraged them to launch new journals. He offered them full academic autonomy as journal editors, promising rapid publication cycles and global marketing. Journals were given several years to break even, subsidised by existing serials, as well as by a rapidly growing suite of textbooks and global encyclopaedias. This business model allowed Maxwell to expand Pergamon’s stable of journals rapidly. He was proud of his relationships with his editors, offering them favourable contracts, generous travel budgets and editorial honoraria, along with lavish parties at Headington Hill Hall, where he based his companies (Stevenson, Citation2009). According to one colleague, Maxwell was smart because ‘he knew just what to offer to buy a person – fame or money’ (Preston, Citation2021, p. 21). In its early years, Pergamon also benefited from Cold War rivalry, landing a lucrative US State Department contract to translate huge numbers of Russian scientific papers, and its profits supported Maxwell’s sprawling business empire.

Maxwell’s tactics were opportunistic and unorthodox, making the most of his diverse networks and reputation. Though famous for his narcissistic and highly controlling business style, his support for his editors paid off: most remained deeply loyal (Haines, Citation1988). Pergamon began partnering with scientific societies, persuading them to outsource publishing and distribution, and in some cases acquiring rights. In return Pergamon offered societies a share of the profits, allowing them to provide services for their members. Existing journals were encouraged to ‘internationalise’, relaunching with new editorial boards and new titles. Maxwell famously claimed that there were almost infinite opportunities for journals to flourish.

The 1950s and 1960s were a time of scientific optimism and generous institutional funding. English had by now largely replaced German as the international language of science (Gordin, Citation2015), and a growing raft of science journals supported international research communities (Meadows, Citation1980). Elsevier continued to expand its journal publishing, prioritising the high profit margins and financial stability that came from repeat library subscriptions (Daling, Citation2006). Pergamon also grew rapidly. By the time Maxwell sold out to Elsevier in 1991 for £440 million, as debts mounted across the rest of his empire, Pergamon had published 7,000 monographs and launched 700 journals, of which more than 400 were still active. Cox (Citation2002, p. 274), a one-time employee, argues that Maxwell had a ‘profound effect on scientific publishing’, which the debacle of his death, his debts and his misuse of the Mirror’s pension funds ‘eclipsed from history’. Whilst the ‘big man’ explanation has its appeals, the success of Elsevier and Pergamon during this period was largely thanks to Cold-War investments in science and technology.

The invention of the citation index

Across the twentieth century, commercial publishers repeatedly redefined scientific practice. So too did the data technologies used to manage this growing flow of information. Eugene Garfield played a central role in championing the importance of information science to research, a field later christened ‘scientometrics’ by the Russian mathematician Vasily Nalimov.

Born in the Bronx to second-generation Lithuanian Jewish immigrants, the young Garfield was inspired by H.G. Wells and his vision of a ‘World Brain’. Trained in chemistry and library science, Garfield developed for his doctoral research an algorithm for converting chemical nomenclature into formulas. Frustrated at the conservatism of traditional abstracting services, he wanted to make research knowledge accessible. Garfield (Citation1979) felt that research funding was not being matched by financing for research communication, and that new technologies of data management could help create ‘efficient’ information systems.

Garfield was not the first to grapple with the growing flow of scientific information. Books of chemical article abstracts were already being published in the late nineteenth century, as German companies launched ‘review’ and ‘progress’ journals (Daling, Citation2006). The American Chemical Society began its own abstracting service in 1907, promoting the visibility of American chemical research in the face of German dominance. Between the wars, this emergent field of ‘information science’ was championed by corporate librarians, as businesses such as Rowntrees and ICI (Imperial Chemical Industries) invested in centralised research libraries (Black & Gabb, Citation2016).

Garfield’s first commercial product was refreshingly low-tech. Realising how hard it was for librarians to keep abreast of new research, he started sending out a weekly pamphlet, photocopying the contents pages of 150 life-science journals. Printed on cheap airmail paper, it became essential reading, sparing librarians and others from having to browse through individual journals. Current Contents, as it was known, started in the life sciences in 1958, but by 1967 there were six different editions, covering 1,500 journals in physics, chemistry and the life sciences.

Setting up the Institute for Scientific Information (ISI) in 1955, and naming it after the influential Moscow research institute established to centralise information processing (Markusova, Citation2012), Garfield began providing a range of reprinting and alerting services to his clients. Corporate subscriptions, especially from the major pharmaceutical companies, helped ISI to expand. By the late 1970s, Current Contents was indexing more than 4,500 journals, with more lobbying to be included. Each issue included a personal essay by Garfield, some promoting scientometric principles, whilst others offered whimsical reflections on the scientific life (Baykoucheva, Citation2019).

Garfield’s profile meant that he was soon in Maxwell’s sights. Garfield was working for Biological Abstracts when Maxwell tried to take it over. After threatening Garfield with legal action for copyright infringement for including his journals in Current Contents, Maxwell tried to recruit Garfield to be his director of research. Subsequently he tried to buy shares in ISI. Garfield later described him as both an adversary and a rogue (Garfield, Citation1997).

Garfield’s most influential idea was deceptively simple: an index of scholarly citations. He was determined to find a robust way to assess the utility and quality of research, and worried about the citation of ‘fraudulent, incomplete or obsolete’ data. He got the idea from a US legal research tool called Shepard’s citations, first launched in 1873, that allowed lawyers to research case law and track precedent. Wouters (Citation1999) offers a richly detailed historical account of Garfield’s exhaustive attempts to convince funders and fellow scientists of the importance of ‘shepardising’ science. Garfield felt researchers needed to understand the ‘transmission of ideas’ through citations and the intellectual structure of thought. Because the total number of citations to an article could be counted, scientists could measure the ‘impact’, and hence importance, of published work. So was launched scientometrics, the science of measuring and tracking the circulation and citation of scholarly knowledge (Garfield, Citation1955). After a long struggle to attract research funding, in 1959 the US Air Force funded a five-year trial of a prototype (Aronova, Citation2021), later followed by National Science Foundation (NSF) funding.

Garfield recognised the logistical impossibility of a comprehensive citation index covering all scientific journals. He turned to Bradford’s law of scattering, named after the British librarian Samuel Bradford (Vickery, Citation1948). Bradford observed that there were exponentially diminishing returns that came from journal searching, and that the most important literature in any scientific field was published only in a narrow group of ‘core’ journals. Garfield decided to develop an index of only the most ‘significant’ journals, claiming that 75% of references in the life sciences were to fewer than 1,000 ‘core’ journals, and 84% were to just 2,000 journals (Garfield, Citation1955, Citation1979). It was also a commercial decision, given the costs of indexing with only basic computing facilities.
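Bradford’s observation is usually given in a schematic form (Bradford himself expressed it verbally, and the multiplier varies by field): if the journals in a discipline are ranked by the number of relevant articles they carry, and then divided into zones each yielding an equal number of articles, the number of journals in successive zones grows geometrically:

```latex
% Bradford's law of scattering: journals per zone, where each zone
% contributes the same number of relevant articles and n is a
% field-specific multiplier greater than 1
1 \,:\, n \,:\, n^{2} \,:\, \cdots
```

A small core of journals thus supplies most of a field’s literature, which is what made a deliberately selective index both intellectually defensible and commercially affordable.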

After trialling the idea with three genetics journals, the first prototype index, published in 1963, assembled citation data from 560 scientific journals, with 70% published from the US or UK, and nearly all the rest from Europe. It was a US-centred representation of the journal landscape, based on Garfield’s existing knowledge and on Current Contents, which had evolved partly in response to commercial subscriptions. ISI released the first full edition of the Science Citation Index (SCI) in 1965. It too prioritised anglophone and US-based science journals. The academic geography of a Euro-American publishing economy was hard-wired into the index from the very start. Two Chinese journals were included, but none from Africa.

The index’s rapid growth paralleled that of Current Contents. In 1966 the SCI included more than 1,150 journals, and by 1968 covered 2,000 journals. Gradually more non-European journals were indexed, but these remained a small proportion of the whole, given the parallel growth in the number of US and European serials. Garfield used the latest computer technology to speed up indexing, employing 100 data operators to add data to a central mainframe via desk tapes. Working two shifts five days a week, they were able to process 25,000 references a day (Garfield, Citation1979). Like Maxwell, Garfield benefited from Cold War tensions and his Russian contacts. Inspired by the Soviet vision of centralised data management, he built relationships with Russian science administrators and the scientometrician Vasily Nalimov. Garfield worked hard at promoting sales of the SCI, and helped broker a deal selling IBM computers to the USSR, including a 10-year subscription to SCI services in the contract (Aronova, Citation2021).

Whilst Garfield’s original aim may have been to facilitate information searching, the index also began to define ‘reputable’ academic knowledge. Inclusion mattered for journals, and publishers were prepared to pay the hefty subscription fees charged by ISI. With ever more ‘international’ journals being launched by Pergamon Press and Elsevier, the index began to take on a gatekeeping role. In the subsequent two decades it doubled in size, and by 1990 was indexing around 4,000 journals.

Many commentators were initially critical. Some mocked the idea that objectivity could be achieved by ‘not reading the literature’ (Oliver, Citation1970). Sociologists and science scholars questioned claims about the index’s global coverage (Frame et al., Citation1977; Narin, Citation1976; Rabkin & Inhaber, Citation1979), and the meaningfulness of the data for different disciplines and regions (Cole & Cole, Citation1971; Rabkin et al., Citation1979). Garfield acknowledged some of these challenges (Garfield, Citation1983), but a critique of the SCI’s systematic discrimination against third-world journals (Gibbs, Citation1995) drew a strong riposte (Garfield, Citation1997). Others pointed to its US bias (Luwel, Citation1999).

Should Garfield have foreseen that universities, academics and publishers would use the index to compete? SCI citation data allowed users to score and rank journals – and researchers – based on their citation ‘impact factor’ (Garfield, Citation1972). Unwittingly or not, Garfield had created the tools for academic and institutional game-playing. The shift was from maps to counts, from ‘descriptive to evaluative’ (Biagioli, Citation2018, p. 250). Csiszar (Citation2020, p. 51) describes how Robert Merton publicly warned Garfield about the consequences of the impact factor, saying that ‘whenever an indicator comes to be used in the reward system of an organization or institutional domain, there develop tendencies to manipulate the indicator so that it no longer indicates what it once did’. This later became known as Goodhart’s law. Later in life, Garfield bemoaned the way that the scientometric ‘tail’ wagged the information retrieval dog, eventually describing the former as a ‘monster’ (Pendlebury, Citation2021). As Pardo-Guerra (Citation2022) aptly puts it, we have, over time, all become ‘quantified scholars’.
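The impact factor derived from SCI data is, in its now-standard two-year form, a simple ratio (sketched here in conventional notation rather than Garfield’s original wording):

```latex
% Two-year journal impact factor for a given year y
\mathrm{JIF}_{y} \;=\;
\frac{\text{citations received in year } y \text{ to items published in } y{-}1 \text{ and } y{-}2}
     {\text{citable items published in } y{-}1 \text{ and } y{-}2}
```

A journal whose articles from the previous two years attracted 500 citations this year, from 200 citable items, would thus have an impact factor of 2.5.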

Initially Garfield’s science index attracted little commercial interest, and income from Current Contents kept ISI afloat. One obituarist described Garfield as ‘visionary’ rather than ‘book-keeper’ (Wouters, Citation2017). His ambitions for the index – a parallel Social Sciences Citation Index (SSCI) was launched in 1973, and the Arts and Humanities Citation Index (AHCI) in 1978 – proved a continuing drain on ISI resources. In 1988 the business was bought by another publisher for $24 million. The indexes were then fully digitised and sold on to Thomson in 1991 for $210 million. Digitisation made the indexes more useful for evaluation purposes, and as countries such as New Zealand and the UK began to carry out regular research assessment exercises, they drew on citation data, even though its validity was questioned, given very different disciplinary citation practices, levels of journal coverage and publication cultures.

The commercial potential of the index became more visible when the first global rankings of universities were launched in the early 2000s, with the ranking algorithms partly based on citation data. In 2016 Thomson Reuters sold the business for $3.5 billion to private-equity investors, who relaunched it as Clarivate. Today, publishers such as Elsevier are rebranding themselves as data-analytics companies, capitalising on the data they hold about research and researchers.

Understanding the continuing power of citation indexes

Concern about the academy’s reliance on these commercial infrastructures has spread across the science system. Neff (Citation2020, p. 39) points out that the consequences of ‘building scientific self-governance around publication statistics’ include guaranteeing ‘the publication industry a supply of government-subsidized content, free labor for assuring quality through peer review, and a virtually certain demand that our host institutions will purchase those products back’. Highly critical of Elsevier’s reinvention as a data-analytics company, Pooley (Citation2022) defines ‘surveillance publishing’ as ‘a business model that derives a substantial proportion of its revenue from prediction products, fuelled by data extracted from researcher behaviour’. Mirowski goes further, seeing the Open Access movement as an attempt to ‘re-engineer science along the lines of platform capitalism’ (Mirowski, Citation2018, p. 173). Commercial academic publishing may be more than a century old (Daling, Citation2006), but Clarivate and Elsevier have become profitable global businesses on the back of the data flows generated by these indexes, with profit margins of around 40%.

Clarivate’s Web of Science ‘core collection’ now covers more than 21,000 journals within its four different indexes (including the original SCI), and in 2021 Clarivate also purchased the ProQuest thesis database. Elsevier, still the largest of the commercial publishing houses, published 500,000 articles in 2,800 journals in 2022, a 70% increase in article volume in a decade (Hanson et al., Citation2023). Seeking to challenge the SCI’s monopoly, Elsevier launched a rival index, Scopus, in 2004. Scopus indexes around 20% more journals and has a more international profile than Web of Science, and Elsevier actively markets its journal packages, research tools and associated consultancy services to universities globally.

Under pressure to assure the quality of indexed journals, both products have ever more exacting evaluation and selection policies. They continue to index only a small proportion of all active academic journals. Web of Science,Footnote1 the more selective of the two indexes, carries out ‘in-house’ evaluations. Candidate journals must first comply with a minimum set of quality standards. Twenty-four quality criteria include ‘adhering to community standards’, a distributed set of authors, the composition of editorial boards, and ‘appropriate citations to the literature’. They are also assessed on content significance and three citation-based metrics: including analysis of author citations, editorial board citations, and comparative citation data. New journals are first accepted into the ESCI (Emerging Sources Citation Index), and promotion to the core indexes – SCIE, SSCI and AHCI – is based on impact data. Journals are regularly demoted or removed from the indexes if they fail to meet these metrics targets: 83 journals were delisted in 2023.

Scopus journal selection is overseen by a group of Elsevier-appointed external experts called the Content Selection and Advisory Board (CSAB).Footnote2 Seventeen Subject Chairs, representing different scientific fields, review all applications for Scopus indexing. Aware of its limited coverage of non-Anglophone journals, in 2019 Scopus created four local Expert Content Selection and Advisory Committees (ECSAC) in Russia, Thailand, South Korea and China. Their task is to seek out ‘titles published primarily for a local audience but deserving of international attention’.

The minimum criteria set for inclusion in Scopus include robust peer-review processes, journal registration, statements on publication ethics, and the requirement to ‘have content that is relevant for and readable by an international audience’, meaning English language abstracts and titles. Candidate journals are scored on their journal policy, journal content, journal standing, publishing regularity and online availability. Journal standing is assessed by the ‘citedness of journal articles in Scopus’, whilst journal policy includes measures of the ‘diversity in geographical distribution’ of editors and authors. Scopus also uses citation-based peer benchmarks to adjudicate inclusion decisions. Self-citation rates more than 200% above the field average, or citation rates below 50% of the field average, are flagged as concerns. Such benchmarks, along with a range of AI tools, are used to regularly delist journals: 50 were removed from Scopus in May 2023 alone.
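The published thresholds amount to a simple screening rule. A minimal sketch in Python (the function and parameter names here are illustrative; Scopus’s actual re-evaluation pipeline is proprietary and combines these benchmarks with other signals):

```python
def flag_journal(self_citation_rate: float,
                 citation_rate: float,
                 field_avg_self_citation: float,
                 field_avg_citation: float) -> list[str]:
    """Return benchmark concerns for a journal, per the thresholds
    described above: self-citation more than 200% above the field
    average, or a citation rate below 50% of the field average."""
    concerns = []
    if self_citation_rate > 3 * field_avg_self_citation:  # >200% above average
        concerns.append("excessive self-citation")
    if citation_rate < 0.5 * field_avg_citation:          # <50% of average
        concerns.append("low citation rate")
    return concerns
```

For example, a journal with a self-citation rate of 0.35 against a field average of 0.10 would be flagged on the first criterion, even if its overall citation rate were healthy.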

An unequal citation economy?

The unequal geographical representation of scholarly journals in Web of Science and Scopus undermines academic journals across the global south. Their selection metrics discriminate against small journals, those published in languages other than English, and those supporting national and regional scholarly ecosystems. Asubiaro et al. (Citation2024) analyse these regional disparities, highlighting how journals from Sub-Saharan Africa were four times less likely to be indexed than those from Europe.

Asubiaro and Onaolapo (Citation2023) use Ulrichsweb (the most comprehensive and inclusive global journal database) and AJOL (African Journals Online) data to estimate that in 2022 there were around 2,200 active academic journals being published across Sub-Saharan Africa. Of these only 166 were indexed in Web of Science (and 174 in Scopus), with around 75% of these published from South Africa. Scopus indexed only around 50 journals from across the rest of Sub-Saharan Africa. For example, 21 Nigerian-published journals were indexed, along with four from Ghana, and five each from Ethiopia and Kenya. Very few journals from Francophone Africa were indexed. Most Africa-published academic journals remain invisible in these indexes.

According to Ulrichsweb, there are now more than 100,000 academic journals published worldwide. Back in 1961 the mathematician Derek de Solla Price predicted that science would continue to grow exponentially, and that by 2000 there would be one million journals (Price, Citation1961). Whilst Price was right about constant growth – currently 5.4% each year, according to Bornmann et al. (Citation2021) – he could not have foreseen the diversification of these communication channels, along with the creation of mega-journals, pre-prints, academia.edu and institutional repositories.
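A rough calculation shows what sustained exponential growth implies. At 5.4% a year, the usual compound-growth formula gives a doubling time of roughly thirteen years, which is why constant growth of the kind Price described produces such striking long-run numbers:

```latex
% Doubling time at a constant annual growth rate of 5.4%
t_{\text{double}} \;=\; \frac{\ln 2}{\ln(1 + 0.054)}
\;\approx\; \frac{0.693}{0.0526} \;\approx\; 13.2\ \text{years}
```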

Four multinational corporations now dominate the academic publishing landscape, each publishing more than 2,000 journals – Springer Nature, Elsevier, Wiley-Blackwell, and Taylor and Francis (STM, Citation2021). They are based respectively in London, Amsterdam, Hoboken (New Jersey) and Oxford. Together, they publish more than 70% of all social science journals, and 50% of journals in the natural sciences. Sage publishes more than 900 journals. Measured by volume of articles, MDPI and Frontiers are almost as large, with growth underpinned by special issues: a model that is commercially profitable but poses major quality challenges in overseeing the work of guest editors.

The global science system has become a citation economy, with academic credibility mediated by the currency produced by Scopus and Web of Science. The reach of these citation indexes and their data analytics is amplified by digitisation, computing power and financial investment. Non-Anglophone journals are disproportionately excluded from these indexes, reinforcing the stratification of academic credibility geographies and endangering regional knowledge ecosystems. Researchers across the majority world are left marginalised and excluded. Some resort to a productivist logic, keeping up by publishing ever more. The result is an integrity-technology ‘arms race’. Amidst anxieties about a supposed epidemic of scientific fraud, publishers and indexes are turning to AI to deal with academic ‘gaming’ and manipulation (Biagioli et al., Citation2020).

In this expanding global research economy, new market opportunities constantly emerge. For example, Hindawi was founded in Egypt in 1997 and became an innovative publisher of 230 Open Access journals. It later moved to London before being bought by Wiley in 2021 for $300 million. Working from bases in Switzerland and China, both MDPI (launched in 1996) and Frontiers (launched in 2007) have also expanded rapidly, offering fast-turnaround ‘gold’ Open Access publishing. They champion efficient ‘customer’ service and aim to review and publish accepted submissions within a few weeks. They require accepted authors to pay article processing charges (APCs), unless they qualify for, or are granted, waivers on grounds of geography, career stage or institutional affiliation. MDPI charges an average APC of £1,900, but, for now, most of its journals (especially in the social sciences) waive 70–100% of these fees. Frontiers – partly owned by the major shareholder in Springer Nature – charges APCs of between £1,000 and £2,500, depending on the funding available in the field. In 2022, Frontiers published 125,000 articles in its 140 journals, and was ranked the third most cited publisher, whilst MDPI published 295,000 articles, a figure that had doubled in two years. Both are fully open access, and make extensive use of special issues, with MDPI publishing 17,777 such issues in 2022 (Crosetto, 2021; Grove, 2023). MDPI’s article output is still less than the 500,000 articles published by Elsevier in 2022.

Meanwhile, elite journal ‘brands’ have become profitable marketing tools for their commercial owners. Where there was once one Lancet, there are now 22 Lancet-branded journals. Springer Nature’s ‘brand expansion’ strategy means there are now more than 30 journals in its portfolio with Nature in their title. Nature itself publishes only the very strongest submissions it receives, ‘cascading’ rejected articles to other Nature-branded journals, including Open Access journals with high publication fees. Whilst Nature has an 8% acceptance rate, and the Nature Research journals a 10% acceptance rate, Nature Communications has a 20% acceptance rate (and a $5,400 APC), and Scientific Reports, Nature’s Open Access mega-journal, has a 60% acceptance rate. Springer Nature has also created its own journal ranking index, and publishes an increasing number of branded supplements, special sections and ‘advertorials’ (Khelfaoui & Gingras, 2022). As the branded journals’ impact factors have risen, long-established professional society journals have seen their submissions and status decline.

At the same time, a range of not-for-profit publishing models continue to flourish. Thirty thousand journals now use the freely available Open Journal Systems (OJS) software, but only 4% of them are included in the Web of Science (Khanna et al., 2022). Across the majority world, journals excluded from Scopus and Web of Science face constant questions about their legitimacy and reputation.

In an unequal global research system, acceleration becomes a survival strategy. Universities continue to rely on national and international rankings as they compete for students, funding and reputation. Many incentivise their staff to increase their publication ‘productivity’ through financial payments and promises of promotion. Commercial publishers support this growth by publishing more articles more quickly, soliciting more special issues, and launching more journals. This accelerates the research publication cycle. Reputational stratification means that those at the peripheries (especially precarious junior and adjunct staff) are forced to publish more and faster, putting yet more pressure on the system.

In this context, some academics take short-cuts to survive. In many countries, researchers – and in some cases practising doctors – need to publish in indexed journals to get promoted. Without previous research experience, their chances of publishing in SCI-indexed journals are slim. One extreme option is to purchase authorship via online brokers. Science sleuths posting to the PubPeer website, and investigative media watchdogs such as Retraction Watch, uncover problematic cases, including special issues whose contents are out of scope, plagiarised or just plain nonsense (Marcus & Oransky, 2014).

Academic credibility and reputation have become precious commercial assets. All fear the financial consequences of an integrity crisis. When, in 2023, Wiley was forced to retract 500 articles published in special issues of journals in its Hindawi imprint, its share price and financial position took a major hit. Wiley has since ‘sunsetted’ the Hindawi brand and continues to face tough questions about its portfolio (Retraction Watch, 2023). It is little wonder that publishers seek to shift the blame onto ‘predatory publishers’ or amplify media caricatures of Chinese ‘paper mills’. Fraud and malpractice get portrayed as an existential threat to the integrity of science, rather than as the inevitable consequence of unequal resourcing and distorted reward structures.

Is open science the answer or still in question?

The modern Open Science movement began in the early years of the internet. Initiatives such as Project Gutenberg made many books available online, whilst a number of publishers launched free-to-read digital journals. A landmark 2001 Budapest meeting convened by the Open Society Institute set out a vision of ‘open’ journals that would make no charge for access. The resulting Budapest Open Access Initiative paved the way for the European-led Plan S, which requires funded researchers to publish their work in Open Access repositories and journals.

Responding to Plan S, commercial publishers developed what they called ‘transformative’ ‘read and publish’ agreements, signing ‘big deals’ with governments, whilst also profiting from ‘gold’ article processing charges (APCs). Far from transforming the publishing ecosystem, this commoditised model of Open Science strengthened the position of commercial publishers, reinforcing fears of academic ‘platform capitalism’ (Knöchelmann, 2021; Meagher, 2021; Mirowski, 2018). The ‘pay to publish’ requirement marginalised under-resourced early career researchers, those without large grants, and many in the majority world.

At the time of writing, the future of Open Science remains contested. The UNESCO (2021) Open Science recommendation envisions research infrastructures that are ‘organized and financed upon an essentially not-for-profit and long-term vision, that enhance open science practices and guarantee permanent and unrestricted access to all, to the largest extent possible’. In May 2023, the European Council recommended that EU member states ‘step up support’ for the development of not-for-profit publishing platforms that are free to both authors and readers (so-called Diamond Open Access). One example is the ‘subscribe to open’ model, in which library subscriptions continue to financially support ‘diamond’ journals, while peer review and editorial quality control continue as before.

A series of Horizon Europe projects, including DIAMAS (Developing Institutional Open Access Publishing Models to Advance Scholarly Communication) and OPERAS (Open Scholarly Communication in the European Research Area for Social Sciences and Humanities), aim to build a community-owned scholarly communication system, with the infrastructures and operating standards for ‘diamond’ Open Access scholarly journals (Mounier & Aspaas, 2023). Funding such an open-source publishing infrastructure would require sustained political will and deep pockets, given that Elsevier alone spends billions each year on technology development. The 2024 Barcelona Declaration on Open Research Information takes this vision one step further, calling for the research information held by commercial infrastructures (such as Scopus and Web of Science) also to be made openly available.

The UNESCO and European Council vision is one of ‘bibliodiversity’ and a pluralist knowledge ecosystem (Berger, 2021). As Shearer et al. (2020, p. 1) note, this ‘diversity in services and platforms, funding mechanisms, and evaluation measures will allow the scholarly communication system to accommodate the different workflows, languages, publication outputs, and research topics that support the needs and epistemic pluralism of different research communities’. Latin America is one exemplar: a strong regional Portuguese- and Spanish-language research ecosystem supported by the Brazil-based SciELO (Scientific Electronic Library Online) database and the Mexico-hosted Redalyc open access journal network. A growing number of such ‘diamond’ Open Access publishing platforms offer a vision of a more equitable research world in which, as Arturo Escobar puts it, ‘many worlds might fit’ (Escobar, 2020).

Conclusion: science communication after the citation economy?

Almost 120 years since Ferdinand Springer set out to commercialise scientific publishing, the sector has become a highly profitable global industry. Citation indexes and digital publishing platforms are now influential data infrastructures that underpin the research economy. ‘Surveillance publishers’ (Pooley, 2022) like Elsevier generate more revenue from predictive analytics (Lamdan, 2022) than from journal subscriptions or APCs. Scholarly reputation is now measured by journal rankings, ‘impact factors’ and ‘h-indexes’. The value of research metadata (and of data on researchers themselves) has been amplified by digitisation, computing power and financial investment. Citation metrics reinforce existing academic ‘credibility economies’ (Mills & Robinson, 2021), built around Euro-American publishing networks and commercial interests. The stratification of academic geographies undermines regional knowledge ecosystems and puts more pressure on those at the global margins.

Some see the exponential growth in publishing ‘outputs’, and a widening global divide, as signs that the science communication system is in crisis (Hanson et al., 2023). The media focus on scientific fraud, whilst publishers pursue an integrity-technology ‘arms race’ (Griesemer, 2020). Is there a way out of this competitive arms race? Diamond Open Access advocates, funders and researchers in Europe are beginning to envision a more equitable research system built around community-owned publishing infrastructures and standards (e.g. Mounier & Aspaas, 2023). Some suggest there are alternatives to journals as the dominant medium of communication (Brembs et al., 2023; Brembs, 2015). Whilst citations are today’s currency of reputational credibility, their centrality should not be taken for granted. If these initiatives are to extend beyond well-resourced European universities, governments across the world will need to adequately fund research and development, and to nurture national and regional research ecosystems. The first step on this journey is helping scholars and universities recognise the limits of citations as a measure of academic value.

Acknowledgements

This paper has benefited from conversations, writing and reflection with Toluwase Asubiaro, Abigail Branford, Natasha Robinson, Kirsten Bell, and Stephanie Kitchen. All mistakes are my own.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Notes on contributors

David Mills

David Mills is Associate Professor in the Department of Education at the University of Oxford, and Director of the Centre for Global Higher Education. His current research focuses on the political economy of the global science communication system. His most recent book is ‘Who Counts: Ghanaian Academic Publishing and Global Science’, co-written with colleagues from the University of Ghana and Oxford, and available open access from African Minds: https://www.africanminds.co.za/who-counts/

References

  • Aronova, E. (2021). Scientometrics with and without computers: The cold war transnational journeys of the science citation index. In M. Solovey & C. Dayé (Eds.), Cold war social science: Transnational entanglements (pp. 73–98). Springer International Publishing.
  • Asubiaro, T. V., & Onaolapo, S. (2023). A comparative study of the coverage of African journals in web of science, scopus, and crossref. Journal of the Association for Information Science and Technology, 74(7), 745–758. https://doi.org/10.1002/asi.24758
  • Asubiaro, T., Onaolapo, S., & Mills, D. (2024). Regional disparities in web of science and Scopus journal coverage. Scientometrics. https://doi.org/10.1007/s11192-024-04948-x
  • Baldwin, M. C. (2015). Making nature: The history of a scientific journal. The University of Chicago Press.
  • Baykoucheva, S. (2019). Eugene Garfield’s ideas and legacy and their impact on the culture of Research. Publications, 7(2), 43. https://doi.org/10.3390/publications7020043
  • Berger, M. (2021). Bibliodiversity at the centre: Decolonizing open access development and change. Development and Change, 52(2), 383–404. https://doi.org/10.1111/dech.12634
  • Biagioli, M. (2018). Quality to impact, text to metadata: Publication and evaluation in the age of metrics. Know: A Journal on the Formation of Knowledge, 2(2), 249–274. https://doi.org/10.1086/699152
  • Biagioli, M., Lippman, A., Csiszar, A., Gingras, Y., Power, M., Wouters, P., Griesemer, J. R., Kehm, B. M., de Rijcke, S., Stöckelová, T., & Fanelli, D. (2020). Introduction: Metrics and the new ecologies of academic misconduct. In M. Biagioli & A. Lippman (Eds.), Gaming the metrics: Misconduct and manipulation in academic research. MIT Press.
  • Brembs, B., Huneman, P., Schönbrodt, F., Nilsonne, G., Susi, T., Siems, R., Perakakis, P., Trachana, V., Ma, L., & Rodriguez-Cuadrado, S. (2023). Replacing academic journals. Royal Society Open Science, 10(7). https://doi.org/10.1098/rsos.230206
  • Black, A., & Gabb, H. (2016). The value proposition of the corporate library, past and present. Information & Culture, 51(2), 192–225. https://doi.org/10.7560/IC51203
  • Bornmann, L., Haunschild, R., & Mutz, R. (2021). Growth rates of modern science: A latent piecewise growth curve approach to model publication numbers from established and new literature databases. Humanities and Social Sciences Communications, 8(1), 224. https://doi.org/10.1057/s41599-021-00903-w
  • Brembs, B. (2015). What should a modern scientific infrastructure look like? The Winnower. https://doi.org/10.15200/winn.143497.72726
  • Brock, W. H. (1980). The development of commercial science publishing in Victorian Britain. In A. J. Meadows (Ed.), The development of science publishing in Europe. Elsevier Science Publishers.
  • Brown, C. H. (1947). Scientific publishing in continental Europe: Notes on its war and postwar status. Science, 106(2742), 54–58. https://doi.org/10.1126/science.106.2742.54
  • Bush, V. (1944). Science, the endless frontier. National Science Foundation.
  • Cole, J., & Cole, S. (1971). Measuring the quality of sociological research: Problems in the use of the “science citation index”. The American Sociologist, 6(1), 23–29.
  • Cox, B. (2002). The Pergamon phenomenon 1951–1991: Robert Maxwell and scientific publishing. Learned Publishing, 15(4), 273–278. https://doi.org/10.1087/095315102760319233
  • Crosetto, P. (2021). Is MDPI a predatory publisher? https://paolocrosetto.wordpress.com/2021/04/12/is-mdpi-a-predatory-publisher/
  • Csiszar, A. (2020). Gaming metrics before the game: Citation and the bureaucratic virtuoso. In M. Biagioli & A. Lippman (Eds.), Gaming the Metrics: Misconduct and manipulation in academic research. MIT Press.
  • Daling, D. (2023). “On the ruins of seriality”: The scientific journal and the nature of the scientific life. Endeavour, 47(4). https://doi.org/10.1016/j.endeavour.2023.100885
  • Daling, D. (2006). The encyclopaedia as pioneer of the journal: The early years of Elsevier’s scientific publishing company, 1936–1956. In M. T. G. E. Van Zelft, F. Glas, & J. Salman (Eds.), New perspectives in book history: Contributions from the Low Countries (pp. 31–48). Walburg Pers.
  • Edelman, H. (2004). Maurits Dekker and Eric Proskauer: A synergy of talent in exile. Logos, 15, 188–190.
  • Escobar, A. (2020). Pluriversal politics: The real and the possible. Duke University Press.
  • Frame, J. D., Narin, F., & Carpenter, B. D. (1977). The distribution of world science. Social Studies of Science, 7(4), 501–516. https://doi.org/10.1177/030631277700700414
  • Fyfe, A., Coate, K., Curry, S., Lawson, S., Moxham, N., & Røstvik, C. M. (2017). Untangling academic publishing: A history of the relationship between commercial interests, academic prestige and the circulation of research. https://doi.org/10.5281/zenodo.546100
  • Garfield, E. (1955). Citation indexes for science: A new dimension in documentation through association of ideas. Science, 122(3159), 108. https://doi.org/10.1126/science.122.3159.108
  • Garfield, E. (1972). Citation analysis as a tool in journal evaluation. Science, 178(4060), 471–479. https://doi.org/10.1126/science.178.4060.471
  • Garfield, E. (1979). Citation indexing: Its theory and application in science, technology, and humanities. Wiley.
  • Garfield, E. (1983). Mapping science in the third world. Science and Public Policy, 10(3), 112–127. https://doi.org/10.1093/spp/10.3.112
  • Garfield, E. (1997). A statistically valid definition of bias is needed to determine whether the science citation index discriminates against third world journals. Current Science, 73(8), 639–641.
  • Geiger, R. (2004). Knowledge and money: Research universities and the paradox of the marketplace. Stanford University Press.
  • Gibbs, W. W. (1995). Lost science in the third world. Scientific American, 273(2), 92–99. https://doi.org/10.1038/scientificamerican0895-92
  • Gordin, M. (2015). Scientific babel: How science was done before and after global English. University of Chicago Press.
  • Griesemer, J. (2020). Taking Goodhart’s law meta: Gaming, meta-gaming, and hacking academic performance metrics. In M. Biagioli & A. Lippman (Eds.), Gaming the metrics: Misconduct and manipulation in academic research. MIT Press.
  • Grove, J. (2023, March 15). Quality questions as publisher’s growth challenges big players. Times Higher Education Supplement. https://www.timeshighereducation.com/news/quality-questions-publishers-growth-challenges-big-player
  • Hackett, E. J. (1990). Science as a vocation in the 1990s - the changing organizational culture of academic science. The Journal of Higher Education, 61(3), 241–279. https://doi.org/10.1080/00221546.1990.11780710
  • Haines, J. (1988). Maxwell. Macdonald and Co.
  • Hanson, M. A., Barreiro, P. G., Crosetto, P., & Brockington, D. (2023). The strain on scientific publishing. arXiv preprint arXiv:2309.15884.
  • Khanna, S., Ball, J., Alperin, J. P., & Willinsky, J. (2022). Recalibrating the scope of scholarly publishing: A modest step in a vast decolonization process. Quantitative Science Studies, 3(4), 912–930. https://doi.org/10.1162/qss_a_00228
  • Khelfaoui, M., & Gingras, Y. (2022). Expanding nature: Product line and brand extensions of a scientific journal. Learned Publishing, 35(2), 187–197.
  • Knöchelmann, M. (2021). The democratisation myth. Science and Technology Studies, 34(2), 65–89. https://doi.org/10.23987/sts.94964
  • Lamdan, S. (2022). Data cartels: The companies that control and monopolize our information. Stanford University Press.
  • Larivière, V., Haustein, S., Mongeon, P., & Glanzel, W. (2015). The oligopoly of academic publishers in the digital era. PLOS ONE, 10(6), e0127502. https://doi.org/10.1371/journal.pone.0127502
  • Luwel, M. (1999). Is the science citation index US-biased? Scientometrics, 46(3), 549–562. https://doi.org/10.1007/BF02459611
  • Marcus, A., & Oransky, I. (2014). What studies of retractions tell us. Journal of Microbiology and Biology Education, 15(2), 151–154. https://doi.org/10.1128/jmbe.v15i2.855
  • Markusova, V. (2012). All Russian institute for scientific and technical information (viniti) of the Russian academy of sciences. Acta Informatica Medica: AIM: Journal of the Society for Medical Informatics of Bosnia & Herzegovina: Casopis Drustva Za Medicinsku Informatiku BiH, 20(2), 113–117. https://doi.org/10.5455/aim.2012.20.113-117
  • Maxwell, E. (1988). Robert Maxwell & Pergamon press: 40 years’ service to science, technology and education. Pergamon.
  • Meadows, A. J. (1980). Development of science publishing in Europe. Elsevier.
  • Meagher, K. (2021). Introduction: The politics of open access — decolonizing research or corporate capture? Development & Change, 52, 340–358. https://doi.org/10.1111/dech.12630
  • Mills, D., & Robinson, N. (2021). Democratising monograph publishing or preying on researchers? Scholarly recognition and global ‘credibility economies’. Science as Culture, 31(2), 187–211. https://doi.org/10.1080/09505431.2021.2005562
  • Mirowski, P. (2018). The future(s) of open science. Social Studies of Science, 48(2), 171–203. https://doi.org/10.1177/0306312718772086
  • Mounier, P., & Aspaas, P. (2023). DIAMAS: Supporting high quality diamond open access publishing. Open Science Talk, 48(48). https://doi.org/10.7557/19.6862
  • Narin, F. (1976). Evaluative bibliometrics: The use of publication and citation analysis in the evaluation of scientific activity. Computer Horizons, Inc.
  • Neff, M. W. (2020). How academic science gave its soul to the publishing industry. Issues in Science and Technology, 36(2), 35–43.
  • Oliver, P. T. (1970). Citation indexing for studying science. Nature, 227(5260), 870. https://doi.org/10.1038/227870b0
  • Pardo-Guerra, J. P. (2022). The quantified scholar: How research evaluations transformed the British social sciences. Columbia University Press.
  • Pendlebury, D. A. (2021). 1.3 Eugene Garfield and the institute for scientific information. In B. Rafael (Ed.), Handbook bibliometrics (pp. 27–40). De Gruyter Saur.
  • Pooley, J. (2022). Surveillance publishing. Journal of Electronic Publishing, 25(1). https://doi.org/10.3998/jep.1874
  • Preston, J. (2021). Fall: The mystery of Robert Maxwell. Viking.
  • Price, D. J. D. S. (1961). Science since Babylon. Yale University Press.
  • Rabkin, Y. M., Eisemon, T. O., Lafitte-Houssat, J.-J., & McLean Rathgeber, E. (1979). Citation visibility of Africa’s science. Social Studies of Science, 9(4), 499–506. https://doi.org/10.1177/030631277900900406
  • Rabkin, Y. M., & Inhaber, H. (1979). Science on the periphery: A citation study of three less developed countries. Scientometrics, 1(3), 261–274. https://doi.org/10.1007/BF02016310
  • Retraction Watch. (2023, December 6). Wiley to stop using “Hindawi” name amid $18 million revenue decline. https://retractionwatch.com/2023/12/06/wiley-to-stop-using-hindawi-name-amid-18-million-revenue-decline/
  • Sadowski, J. (2019). When data is capital: Datafication, accumulation, and extraction. Big Data & Society, 6(1), 2053951718820549. https://doi.org/10.1177/2053951718820549
  • Shearer, K., Chan, L., Kuchma, I., & Mounier, P. (2020). Fostering bibliodiversity in scholarly communications: A call for action!. https://digitalcommons.unl.edu/cgi/viewcontent.cgi?article=1153&context=scholcom
  • Slaughter, S., & Leslie, L. (1997). Academic capitalism: Politics, policies, and the entrepreneurial university. Johns Hopkins University Press.
  • Slaughter, S., & Rhoades, G. (2004). Academic capitalism and the new economy: Markets, state and higher education. Johns Hopkins University Press.
  • Stevenson, I. (2009). Robert Maxwell and the invention of modern scientific publishing. Publishing History, 65, 97–III.
  • STM. (2021). The STM report: Global research trends and transformation in Open Access.
  • UNESCO. (2021). UNESCO recommendation on Open Science.
  • Veblen, T. (1918). The higher learning in America: A memorandum on the conduct of universities by business men. B. W. Huebsch.
  • Vickery, B. C. (1948). Bradford’s law of scattering. Journal of Documentation, 4(3), 198–203. https://doi.org/10.1108/eb026133
  • Weber, M. (1918 [1946]). Science as a vocation. In H. H. Gerth & C. Wright Mills (Eds.), From Max Weber: Essays in sociology. OUP.
  • Wouters, P. (1999). The citation culture [PhD dissertation, Faculty of Science]. University of Amsterdam. https://hdl.handle.net/11245/1.163066
  • Wouters, P. (2017). Eugene Garfield (1925–2017). Nature, 543(7646), 492. https://doi.org/10.1038/543492a