
Shaping scientific work in universities in Chile: exploring the role of research management instruments


Article: 2236503 | Received 02 Aug 2022, Accepted 07 Jul 2023, Published online: 30 Nov 2023

ABSTRACT

Research management instruments (RMIs) are organizational mechanisms that shape scientific work and influence the trajectory of scientific fields within universities. This qualitative study examines 80 RMIs implemented by eight research-oriented universities in Chile between 1998 and 2021. The findings reveal that these institutions employ policies prioritizing competition as the primary means of accessing funding and opportunities, contributing to the concentration of resources among established researchers participating in international circuits. Consequently, RMIs establish hierarchies within the research community based on individual merit, disregarding the material conditions that may hinder productivity for certain actors. Furthermore, these instruments discourage participation in national and regional scientific communication networks. By highlighting the impact of RMIs, this research enhances our understanding of the organizational mechanisms that shape scientific work in Chilean universities, offering insights into the challenges and opportunities researchers face in the country’s higher education system. Future studies should explore alternative participation circuits within Chilean universities and compare experiences across Latin American regions to understand how local institutions align with global evaluation criteria.


1. Introduction

Research management instruments (RMIs) are organizational mechanisms that aim to align individual behaviors with institutional guidelines by stimulating and evaluating scientific research. These instruments include policies, regulations, competition rules, and manuals. Often overlooked, they constitute an essential infrastructure that significantly influences and governs scientific work within universities and research institutions. RMIs define the scope of research by determining problem areas, funding sources, and eligible researchers (Rovelli Citation2017; Mathies, Kivistö, and Birnbaum Citation2020; Liu et al. Citation2019). In this manner, they play a crucial role in shaping the field of scientific inquiry (Mathies, Kivistö, and Birnbaum Citation2020; Jiménez Citation2019). As part of academia’s practical politics (Bowker and Star Citation1999), researchers contend with the influence of classifications and standards imposed by RMIs, which can profoundly impact their lives. The study of RMIs provides insights into how recent transformations in higher education systems translate into organizational mechanisms that shape scientific work, addressing the link between macro-level changes and governance instruments (Cruz-Castro and Sanz-Menéndez Citation2018). In doing so, their study furthers our understanding of the dynamics of science in Latin America.

Since the 1990s, Latin America has experienced profound transformations in the organization of scientific work (Bruner, Ganga-Contreras, and Rodríguez-Ponce Citation2018; Góngora Citation2021; Rovelli Citation2017; Viales-Hurtado Citation2021). During this period, the role of the State has evolved, becoming a dynamic agent that drives research efforts and emphasizes the pivotal role of technology and innovation in fostering productive development (Viales-Hurtado Citation2021). In parallel, funding agencies, universities, and research institutions have increasingly adopted governance strategies influenced by the New Public Management (NPM) approach. These strategies aim to enhance productivity and foster a service-oriented environment by implementing market-like incentives and heightened accountability (Hicks Citation2012; Bruner, Ganga-Contreras, and Rodríguez-Ponce Citation2018).

Furthermore, the diffusion of new communication technologies has facilitated the expansion of international knowledge circulation and collaborative research (Beigel Citation2014), contributing to consolidating global evaluative cultures (Reymert, Jungblut, and Borlaug Citation2021). Despite these developments, Latin America remains situated on the periphery of the global scientific system. Researchers working in these regions often face limited opportunities to contribute to mainstream international circuits (Beigel Citation2014; Beigel, Gallardo, and Bekerman Citation2018; Kreimer Citation2011; Feld and Kreimer Citation2019; Koch, Vanderstraeten, and Ayala Citation2021). With variations across disciplines and localities, their focus may be more strongly directed toward less prestigious national and regional circuits (Beigel Citation2014; Beigel, Gallardo, and Bekerman Citation2018).

In this context of profound transformations, studying RMIs provides insights into the organizational mechanisms that shape scientific work and stimulate specific forms of research in Latin America. RMIs are the mundane instruments that materialize these trends within organizations, articulating macro-level changes with institutional governance processes, often not without friction and conflict.

The higher education system in Chile serves as an intriguing case for examination. The implementation of governance strategies influenced by the NPM approach has led to significant changes in the management of scientific work, transitioning from institutional block funding to targeted, competitive funding schemes that reward institutions and research groups meeting specific governmental criteria (Araneda-Guirriman, Gairín-Sallán, and Pedraja-Rejas Citation2018). Universities in Chile have established organizational mechanisms within this public policy framework to align individual behaviors with institutional strategies, including internal research funding opportunities, career advancement criteria and procedures, and publication incentives (Araneda-Guirriman, Gairín-Sallán, and Pedraja-Rejas Citation2018; Beigel Citation2014; García de Fanelli Citation2019). Despite their significance, RMIs have only recently begun to receive attention, with a particular focus on the critical analysis of strategies based on bibliometric techniques (Sisto Citation2017; Koch, Vanderstraeten, and Ayala Citation2021). However, the comprehensive characterization and broader impact of RMIs on the organization and dynamics of scientific research in Chile remain compelling areas for further exploration.

This paper aims to highlight the role of university RMIs in shaping scientific work and influencing the trajectory of scientific fields in Chile. The study is based on a qualitative document analysis of 80 RMIs implemented across eight research-oriented universities in Chile from 1998 to 2021. The findings suggest that these institutions, through the design of their policies, promote competition as the dominant mechanism for accessing funding and opportunities, ultimately leading to the concentration of resources in the hands of already established researchers who have been able to participate in international circuits. Consequently, RMIs position researchers as members of an imagined community of peers that is hierarchically ordered according to their merit in meeting these criteria. However, these instruments inadvertently silence the material conditions of scientific work that hinder certain actors from meeting productivity criteria. Additionally, they obscure and disincentivize participation in national and regional communication circuits of science. By shedding light on the influence of RMIs on scientific practices, this research contributes to a broader understanding of the organizational mechanisms that shape scientific work in universities in Chile, providing insights into the challenges and opportunities researchers face in the country’s higher education system.

This paper consists of five sections. After this introduction, a second section provides a theoretical framework for analyzing RMIs, drawing from studies on evaluative cultures and the metricization of higher education systems. The third section discusses the methodological decisions that informed our study. The fourth section presents a qualitative analysis of RMIs used in eight Chilean universities. Finally, in the fifth section, we discuss some implications of our findings and draw some conclusions.

2. RMIs and the transformation of higher education systems

Examining university-level RMIs offers valuable insights into the relationship between macro-level changes and governance instruments (Cruz-Castro and Sanz-Menéndez Citation2018), contributing to the growing literature on evaluation technologies and their effects on research ecosystems and individual trajectories. This section explores the growing adoption of research governance strategies influenced by the New Public Management approach, which aims to enhance productivity and accountability. We delve into the limitations and challenges associated with performance-based research funding and ex-ante evaluations of projects and individuals, highlighting the importance of research on evaluative cultures and metricization processes. Furthermore, we underscore the role of RMIs as governance mechanisms, examining the potential tensions and unintended consequences they may generate based on current research.

Funding agencies, universities, and research institutions have increasingly embraced governance strategies influenced by the New Public Management approach. These strategies aim to enhance productivity and cultivate a service-oriented environment by implementing market-like incentives and increased accountability (Hicks Citation2012; Bruner, Ganga-Contreras, and Rodríguez-Ponce Citation2018). The literature highlights two primary funding mechanisms that have replaced institutional block funding. Firstly, performance-based research funding (PBRF) allocates resources at the organizational or institutional level based on ex-post assessments of research performance (Hicks Citation2012; Zacharewicz et al. Citation2019; Abramo and D’Angelo Citation2015; Good et al. Citation2015). In Chile, since 1998, higher education funding in the public sector (CRUCH) has been weighted according to the performance of each university. Government authorities set institutional goals, and indicators measuring educational and research activity are reported to them (Araneda-Guirriman, Gairín-Sallán, and Pedraja-Rejas Citation2018; Sisto Citation2017).

Secondly, funding mechanisms that rely on ex-ante evaluations of projects and individuals may involve identifying priority areas for research and implementing technologies based on peer review (Kreimer Citation2011). Research on the first type of mechanism has shed light on the challenges of steering research communities and evaluating research impact (Spinello, Reale, and Zinilli Citation2021). Furthermore, studies on peer review have underscored its limitations. According to Guthrie, Ghiga, and Wooding (Citation2017), peer review is costly as an evaluation technology, exhibits bias against innovative research, and is a weak predictor of future performance.

Research on evaluative cultures and metricization processes offers a critical perspective for assessing the functions and impacts of evaluation technologies on research ecosystems, institutions, and individual trajectories. Lamont (Citation2009) emphasizes that evaluation technologies, particularly those based on peer review, enable negotiations regarding the definition of excellence in research fields and the allocation of prestige. Furthermore, these technologies enhance transparency, accountability, and legitimacy in decision-making processes within research and higher education systems. Consequently, evaluation technologies can articulate the operations of the system of science with political decision-making (Reinhart and Schendzielorz Citation2021). In the context of university-level RMIs, this perspective suggests that these mechanisms mediate the authority derived from the research community with organizational or hierarchical leadership (Cruz-Castro and Sanz-Menéndez Citation2018).

The term “metricization,” as defined by Burrows (Citation2012), refers to the increasing reliance on quantitative measures in academic performance evaluation systems. The metricization of universities involves the creation of surveillance and subjectivation technologies that observe and manage academic careers (Barron Citation2021; Clarke and Knights Citation2015). By using “hard” categories that purport to be objective and universal (Spence Citation2019), quantitative evaluation technologies contribute to the production of bibliographic (Lim Citation2021) or quantified identities (Fardella, Corvalán-Navia, and Zavala Citation2019). Individual academics participate in these systems by producing works that align with institutional, national, and global information infrastructures and reporting their performance using these terms (Barron Citation2021; Lim Citation2021).

Among university governance mechanisms, RMIs play a role in this context of profound transformations of higher education systems. These seemingly mundane and often overlooked instruments serve as a means for institutions to align their administrative objectives with knowledge production processes. They establish objectives for academics to achieve while regulating essential factors such as research areas (Rovelli Citation2017), preferred journals for publication (Mathies, Kivistö, and Birnbaum Citation2020), and researchers’ working locations (Liu et al. Citation2019). The application of RMIs can be seen as a guide that individuals use to align their actions with the expected outcomes outlined in the instruments, considering symbolic disputes, bibliometric goals, and immediate economic results (Mathies, Kivistö, and Birnbaum Citation2020).

The literature suggests that organizational-level research governance instruments can have unintended consequences (Rovelli Citation2017). RMIs may generate tensions with the traditions of scientific work, such as an increase in, and aversion to, supervision processes (Fardella, Corvalán-Navia, and Zavala Citation2019; Góngora Citation2021; Jiménez Citation2019), distortions in the quality of knowledge production (Clarke and Knights Citation2015; Good et al. Citation2015; Spence Citation2019), promotion of competitive dynamics (Góngora Citation2021), questionable reporting of university placements (Broitman and Rivero Citation2022), institutional rigidity in adapting to new policies (Androgué et al. Citation2019), and loss of autonomy (Araneda-Guirriman, Gairín-Sallán, and Pedraja-Rejas Citation2018; Castillo and Watson Citation2017).

3. Materials and methods

To explore the impact of RMIs on scientific activities in Chilean universities, we conducted a qualitative documentary analysis study (Prior Citation2003) of 80 instruments used in eight research-oriented universities in the country. This study involved a systematic review and analysis, with the RMIs as the primary focus of observation. We adopted this perspective due to its comprehensive nature, emphasis on local contexts, and sensitivity to constructing textual worlds (Prior Citation2003). Treating the documents as social artifacts, we recognized their creation, consumption, sharing, and organized use within a social framework (Atkinson and Coffey Citation1997; Prior Citation2003). Consequently, we consider RMIs to be elements within an information infrastructure (Bowker and Star Citation1999; Kreimer Citation2011) capable of reflecting norms in the management of research endeavors.

The study design employed a theoretically grounded sampling approach (Patton Citation2014). For the university selection, we identified the top eight institutions in the 2021 Scimago ranking for Chilean universities. This number was chosen to encompass a significant portion of research activity in Chile, as the top ten institutions account for a substantial proportion of national research output (Mondaca et al. Citation2019). The Scimago ranking was selected as an internationally recognized standard for bibliometric measurement (Mondaca et al. Citation2019).

The timeframe for the implementation of RMIs spanned from 1998 to 2021. This period is justified by the introduction of performance-based funding in 1998 and the subsequent significant policy changes in fund allocation, which are considered crucial in shaping the design of RMIs within the NPM framework. We considered various types of RMIs for each university to encompass the range of devices employed, distinguishing between institutional policies, regulations, call guidelines, and manuals based on the labels assigned by each university. When obtaining specific label information was not feasible, we applied predefined definitions for each category. Thus, institutional policy documents were interpreted as articulating the broad norms that define the university’s research interests. Regulations referred to the regulatory codes governing the implementation of research policies. Call guidelines encompassed invitations to the academic community to participate in research projects or seek resources to support their work. Lastly, manuals pertained to technical documents outlining the use of institutional infrastructure.

In all cases, the data sources were the websites of the respective Vice-Rectorates for Research and Development. In some instances, other sources hosting academic and research career policies and regulations were also consulted. The websites were reviewed from August to September 2021, and only documents issued directly by the institutions themselves were considered, excluding any instruments from external sources. A total of 80 documents were selected, averaging approximately ten documents per university. It is important to note that for University 7 (U7), only four documents from its Vice-Rector for Research were available. However, one of these documents contained all the funding programs for research projects at the university, from which six programs were selected. Table 1 summarizes the number of available RMIs per university (N) and the number selected (S). For practical purposes, an RMI that repeated year after year was counted as a single instrument. The percentage in the final row of Table 1 indicates how many documents were considered relative to the available documents of the same type.

Table 1. Number of available RMIs per university (N) and number of selected RMIs (S).

We conducted a pragmatic discourse analysis in three phases. The first phase involved the preparation and organization of the material: texts were anonymized and labeled according to the university they belonged to. The second phase entailed an initial exploration of the data using procedures outlined by Grounded Theory (Glaser and Strauss Citation1967); texts were coded using CAQDAS (computer-assisted qualitative data analysis software). The systematic reading of codes allowed us to identify patterns and regularities, which were subsequently grouped into emergent categories. In the third phase, the categories were discussed, filtered, and reorganized based on their ability to answer the research questions (Wetherell Citation2007).

Thus, during the analysis and discussion of the documents, four key categories emerged to understand the normativity of the RMIs:

  1. Objectives: This category includes the declared objectives or functions of the instrument.

  2. Participation conditions: This category encompasses the declared criteria for including and excluding participants.

  3. Allocation criteria: This category focuses on the declared evaluative criteria for allocating the instrument.

  4. Clauses: This category comprises the declared requirements that must be fulfilled to terminate the execution of the instrument.

To ensure coding quality and interpretive validity, we employed triangulation and member checking (Maxwell Citation2005).
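The coding workflow described above can be sketched schematically in code. The snippet below is a minimal illustration only, not the authors’ actual pipeline: the real analysis was conducted in CAQDAS software, and the document IDs and category pairings here are hypothetical, loosely mimicking the paper’s labels (e.g. U2-PI).

```python
from collections import Counter

# Hypothetical coded segments: (document ID, emergent category).
# The pairings are invented for illustration only.
coded_segments = [
    ("U2-PI", "objectives"),
    ("U2-PI", "allocation_criteria"),
    ("U5-FIIE", "allocation_criteria"),
    ("U6-RCA", "participation_conditions"),
    ("U3-RCA", "clauses"),
]

# Phase 2: tally how often each emergent category was coded.
category_counts = Counter(category for _, category in coded_segments)

# Phase 3: group documents under each category for discussion and filtering.
docs_per_category = {
    category: sorted({doc for doc, c in coded_segments if c == category})
    for category in category_counts
}

print(category_counts["allocation_criteria"])    # 2
print(docs_per_category["allocation_criteria"])  # ['U2-PI', 'U5-FIIE']
```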

4. Results

In this section, we explore how RMIs shape scientific work. We begin by examining the role of university policies and their relationship with regulations and call guidelines. We then explore the coordination challenges among these instruments, illustrating how the lack of explicit coordination may reinforce the emphasis on productivity over other criteria for evaluating scientific work. Furthermore, we investigate the selection criteria for conducting research and advancing in academic hierarchies, highlighting the significance placed on research outputs and productivity indicators. Lastly, we analyze the type of research incentivized by RMIs through evaluative criteria, including novelty, impact, collaboration, knowledge dissemination, international linkage, and excellence.

4.1. How do RMIs perform scientific work?

University policy plays a central role in shaping RMIs by establishing strategic organizational decisions declaratively, without requiring further justification. For all observed cases (9 documents in total, including 2 from U3), policies dictate regulations for academic jobs by stimulating and evaluating research in competitive calls. A clear example of this trend can be seen in the research policy of U2, which states: “The University will provide economic and academic incentives to researchers for their research outputs and the generation of new knowledge (…) Internal economic resources available for research will be allocated through competitive processes based on public procedures” (U2-PI).

Research policies often incorporate regulations restricting research activities and giving rise to additional instruments designed to incentivize specific behaviors. Competition rules were the most prevalent among the procedural RMIs identified in our analysis (53 documents, 66%). This instrument stands out as it is widely employed across universities and disciplines in our sample. Money transfer competitions are the preferred approach to research management, as all the calls for proposals presented this modality. They create a funding landscape characterized by selectivity, where only the most promising proposals are chosen in a competitive procedure.

Although RMIs present themselves as systems or sets of interrelated instruments within each organization, there is often a lack of coordination among policies, regulations, and competition rules. An instance of coordination failure can be observed in a research policy that initially lists 13 guiding principles, including productivity and inclusion. Later in the document, principles are translated into objectives. The principle of productivity is defined as “strengthening and expanding capacities in creation, research, infrastructure, innovation, and technology transfer” (U8-PI). Surprisingly, the principle of inclusion is omitted at this stage. Consequently, when mechanisms are designed to assess compliance with the objectives, procedural RMIs fail to reference the principle of inclusion. In this case, evaluation criteria incorporate measures of productivity but not inclusion. For example, one of the call guidelines used in this university includes the following indicators:

Number of applied research, innovation, or entrepreneurship projects derived from acquired capabilities; Number of press releases or extension articles related to training activities and dissemination activities; Number of technical, continuous, undergraduate, and postgraduate training courses where acquired knowledge and capabilities are applied; Level of satisfaction of attendees in dissemination activities. (U7-IFI)

In this case, the initial mention of productivity and inclusion in the policy is ultimately assessed mainly through productivity indicators.

In a second example (U5), institutional emphasis on improving societal well-being is omitted in evaluation technologies focused on scientific novelty and project feasibility. In its research policy, U5 establishes the mission of research as follows: “The mission of research (…) is to enhance the improvement of our society’s well-being, serving as a significant lever for the development of excellent human capital and for the understanding and resolution of complex problems” (U5-RGI). However, the same orientation is not reflected in the RMIs derived from this policy. For instance, one of their competitions outlines the allocation criteria as follows: “[The Research and Doctoral Directorate] must evaluate the novelty and feasibility of the proposed research and, based on this assessment, recommend the allocation or non-allocation of funds” (U5-FIIE). In this manner, competition guidelines may deviate from the intentions expressed in research policies, in this case emphasizing scientific novelty, feasibility, and impact in academic publication circuits.

4.2. Who can do research?

A group of institutional documents explicitly state selection criteria for career advancement in academic hierarchies and competition access. Firstly, the regulations for academic careers provide instructions on how the hierarchy is structured: “The academic career governs the trajectory of professors at [University] based on the academic merits outlined in this Regulation” (U6-RCA). Secondly, access to competitions is limited by contractual requirements that identify individuals eligible to participate in the community’s calls. It is common in competitions to require as a condition the possession of a specific academic rank or a minimum number of hours worked for the university: “The competition is open to researchers who have at least a half-time contract in the Faculty of Engineering Sciences” (U2-BCAI).

RMIs establish criteria for advancing in academic careers, which are constructed in terms of individual merit. We were able to access documents in six universities regarding systems for evaluating academic careers. For all the observed cases, an academic career is structured as a hierarchy, and merit is estimated through productivity measures that become more stringent as higher ranks are considered. For instance, at U3, to move from an instructor to an assistant professor, the academic must demonstrate academic and personal qualities aligned with the university’s mission, hold a doctoral degree, work as an instructor for 400 h, exhibit a good level of teaching in undergraduate courses, have at least one publication, supervise the work of teaching assistants, collaborate in administrative tasks, and ensure a continuous contribution to the university (U3-RCA). To achieve promotion from assistant professor to associate professor, the academic is required to have demonstrated a commitment to the university in their trajectory, hold a doctoral degree, work as an assistant professor for 400 h, achieve a distinguished level among peers as a researcher through the number of publications, research projects, and dissemination of experiences, demonstrate dedication to guiding the work of teaching assistants and students, collaborate in administrative tasks, and ensure a continuous contribution to the university (U3-RCA). Then, to go from associate professor to full professor, they must “accredit a remarkable research career at an international level” (U3-RCA). In this case, criteria are accentuated, evidencing the persistence of a model that incentivizes both productivity and participation in international scientific communication circuits (Beigel Citation2014) above other forms of academic work.

Academic regulations at U1 are also illustrative of the emphasis on academic productivity, measured in terms of research projects and publications:

An objective validation of the quality and relevance of the research activities of an academic is the interest they arouse in their disciplinary community at national and international level. Therefore, both the publications generated and the funding that supports them are performance indicators. (U1-PGCA)

In all cases, evaluation criteria in the academic career focus on the merit of the evaluated individuals. RMIs thus regulate scientific work by treating individual academics as units that can be ranked according to their merit, which is inferred through productivity indicators. They make up a community of peers ordered under a principle of equivalence: everyone will be rewarded according to their efforts. However, to the extent that these criteria single out particular virtue-related areas of individual trajectories, anything beyond them is ignored. Difficulties, exclusions, leaks, or any indication of an external world are not taken into account. Thus, the research community is presented as a fictional group in which the material conditions of scientific work are silenced under a principle of partial equivalence.

However, in some RMIs we have observed an emerging focus on care, which aims to avoid evaluation based solely on the criteria described above. These RMIs are implemented in five universities in our sample (six documents at U1, two at U2, five at U4, one at U6, and one at U7). Rather than focusing solely on individual merit, they prioritize addressing the challenges individuals face in their work. They take into account factors that hinder the adequate representation of civil society in academia and provide avenues for promotion beyond personal virtues. Examples include a decentralization bonus equivalent to an additional 10% of the approved fund (U6-CCAC), a “Childcare Support Fund” for female academics with children under 12 years old (U1-PBE), and efforts to overcome cultural and institutional barriers that impede the equal development of women and men in the university and the country (U4-PCAF).

4.3. What kind of research do RMIs incentivize?

RMIs influence the type of research encouraged in universities primarily by establishing criteria for evaluating research projects. According to our findings, the most frequently used criteria in research project competitions, in descending order, are novelty (40 documents, 75% of competitions), impact (37, 70%), collaboration (37, 70%), knowledge dissemination (36, 68%), and international linkage (33, 62%). Although less commonly used, the criterion of academic excellence (24, 45%) carries the most weight in the evaluation matrices.

Novelty: The criterion of novelty encompasses two perspectives. The first relates to the current academic standing of the principal researcher; the second pertains to the timeliness of the scientific knowledge underlying the research project. To assess the first aspect, the number of publications in mainstream journals and involvement in externally funded projects are considered. For instance: “The productivity of each member of the research team must be demonstrated through indexed publications (WoS or Scopus) and their participation as Principal Investigators in externally funded projects in the last five years (e.g. Fondecyt)” (U4-BPN). Similarly, a common practice among RMIs is to consider only the past five years of research output. This criterion favors active researchers who have used their time effectively, since each outcome counts within short evaluation cycles.

The second aspect of novelty, concerning the up-to-dateness of knowledge, is typically evaluated by expert panels. These panels consist of academic peers who possess knowledge of advancements in a specific field of study. Peer evaluation implies that the allocation criteria in competitions are determined by community members who have been validated by meeting academic career evaluation criteria.

Impact: The criterion of impact evaluates the significance of the research project’s expected results. The evaluation relies primarily on two characteristics that indicate the project’s potential. The first is the project’s ability to produce research papers published in reputable journals. The second, the production of knowledge beneficial to the public good, particularly regarding technological uses, is measured through patent applications. The latter aims to contribute to developing a national productive matrix based on science, technology, knowledge, and innovation. For example, an RMI states:

The objective is the development and/or validation of technology through concept testing in the context of applied research, which, through achieving a continuity milestone, enables the development of solutions that lead to new products, services, or processes to meet a market need, generating significant economic and/or social impact. (U7-PCPI)

Collaboration: The criterion of collaboration focuses on the formation of research teams. It is common for RMIs to encourage research conducted by groups of academics to promote institutional collaboration. For instance, “Received applications will be reviewed to determine their eligibility, compliance with the present guidelines, including their interdisciplinary nature, and international and/or national collaborations” (U8-BII). According to our findings, numerous RMIs promoting collaborative research do not provide specific instructions for group composition besides forming research teams that include students, individuals from different disciplines, and foreign researchers. Thus, apart from promoting intergenerational, interdisciplinary, and international research, no other indications guarantee the heterogeneity of successful academic projects.

Knowledge dissemination: The criterion of knowledge dissemination refers to the requirements set by call guidelines for organizing events that inform the community about research outcomes. These events serve as secondary outputs of the project, but like the formation of collaborative groups, there are limited and unclear guidelines on achieving successful dissemination. For instance, in the guidelines of a funding competition that considers knowledge dissemination a significant criterion for awarding funds, the only instruction is that the responsible researcher should disseminate the proposals and their results within the departments (U6-PIIE). Similarly, none of the guidelines specify deadlines, evaluation indicators, or dedicated resources for the dissemination aspect.

International linkage: The criterion of international linkage refers to evaluating and promoting research activity in conjunction with the work of the international community. For example, “This contest is designed to support doctoral and Master’s programs in bringing foreign researchers to strengthen their research lines” (U8-BVPE). Notably, this criterion was highly valued: numerous instruments are designed specifically to establish connections with foreign institutions. Unlike collaboration and knowledge dissemination, these instruments define what a successful case is and how it is evaluated. For instance, at U3, one instrument aims to “promote and strengthen national and international academic partnerships” (U3-AAEVI). The selection process is described as follows:

The evaluation committee will consider the following criteria to assess the contest (…): Curriculum vitae of the last 5 (five) years; Relevance of the event to be participated in and the work to be carried out; Pre and/or post-congress or exhibition arrangements. (U3-AAEVI)

According to our findings, international linkage holds a prominent place in research activity, as dedicated instruments are designed exclusively for this purpose, with specific resource allocations and evaluation criteria to assess their success.

Excellence: According to allocation weights, excellence is the most heavily weighted characteristic in call guidelines. However, although it generally involves the submission of high-quality research proposals, no explicit definition is provided for assessing excellence. Nevertheless, unlike collaboration and knowledge dissemination, its evaluation is strict and entrusted to an expert panel usually tasked with ensuring excellence. It is presented as follows: “The aforementioned evaluation committee will determine the outcome of the contest based on compliance with these guidelines and the following criteria: Quality of the proposal presented (…)” (U4-AII). According to our findings, excellence is often treated as an ineffable criterion: it marks off a specific knowledge area and establishes a threshold that only field experts can assess. In other cases, however, call guidelines indicate that excellence can be inferred from individual productivity. For example, an RMI states: “Objective: Facilitate the integration of young academics of excellence into the [University]” (U2-BUI). And later: “Interested individuals must demonstrate high potential to achieve, in the near future, a relevant level of scientific productivity, meaning they are capable of developing an independent research line” (U2-BUI).

5. Discussion and conclusion

In this study, we have explored how Chilean universities’ RMIs shape scientific work. Previous research has noted that RMIs define research parameters by specifying problem areas, funding sources, and eligible researchers (Rovelli Citation2017; Mathies, Kivistö, and Birnbaum Citation2020; Liu et al. Citation2019). Our findings have revealed three dimensions in which these instruments shape scientific practices.

In the first dimension, which focuses on formal aspects, we have observed that these instruments shape scientific work through their discursive style and integration within the same institution. Research policies adopt a declarative style that establishes specific mechanisms as natural practices, despite being determined by organizational decisions. This style reinforces the perception that these practices are necessary. We have also noted that the way instruments are interconnected may contribute to replicating critical elements in the organization of scientific work. In this respect, we have identified instances in which institutions express interest in emphasizing the social impact of research. However, this intention is not translated into specific objectives and evaluation criteria in funding calls. As a result, egalitarian principles stated in policies can be disregarded in practice (Bhopal Citation2016), perpetuating evaluation practices that prioritize impact within scientific communication networks.

In the second dimension, RMIs exert influence by establishing standards for scientific activity that shape individual behaviors and identities (Lounsbury Citation2001). These standards dictate how researchers should conduct their work and what alternatives are available (Busch Citation2013). We have observed that RMIs establish standards for classifying academic trajectories based on individual merit, emphasizing productivity indicators and participation in international scientific communication networks. However, these standards overlook various factors that affect academic performance, such as gender (Vohlídalová Citation2021), ethnicity (Bhopal Citation2016), socioeconomic class (Chiappa Citation2021), precarious working conditions (Pérez and Montoya Citation2018), challenges in balancing research, management, and teaching responsibilities (Fardella-Cisternas, Espinosa-Cristia, and Garrido-Wainer Citation2023), and mental health issues (McAlpine and Amundsen Citation2015). Consequently, established researchers are more likely to receive favorable evaluations, perpetuating existing inequalities in the academic system. To echo the words of Bowker and Star (Citation1999, 34), “One person’s infrastructure may be another’s barrier.”

In the third dimension, RMIs shape scientific work by establishing criteria for its evaluation, reinforcing the dominance of international scientific communication networks. These instruments align with government mechanisms that promote engagement in international circuits, which are increasingly prioritized in Latin America (García de Fanelli Citation2019; Beigel, Gallardo, and Bekerman Citation2018). This is particularly the case in the Chilean higher education system, where the Ministry of Education defines yearly increases in governmental funding of “public” (CRUCH) universities using criteria that prioritize publications in journals indexed by Web of Science (Ministerio de Educación Citation2023). By translating this emphasis within universities, RMIs downplay participation in national and regional science communication circuits.

Lastly, the criterion of excellence holds a significant position within RMI systems. It is the most relevant criterion for evaluating research proposals, yet no explicit definitions are provided. The application of excellence criteria consistently relies on peer review mechanisms. Previous research has highlighted that this approach legitimizes evaluative processes within scientific communities and political decision-making regarding the allocation of limited resources (Lamont Citation2009; Reinhart and Schendzielorz Citation2021). The absence of explicit definitions or indicators leaves room for each scientific community to negotiate what constitutes excellence.

In conclusion, RMIs in the selected Chilean universities profoundly shape scientific work through formal aspects, career standards, and project evaluation criteria. These findings highlight the need for a critical examination of these instruments and their potential implications for the scientific community and the allocation of resources.

Our study has limitations that point to future lines of research. Our analysis was based on a subset of 80 documents from the top eight universities in the Scimago ranking for Chile, covering the period from 1998 to 2021. This sample excludes a significant portion of the instruments and universities in the country, meaning that our findings may not fully represent the national landscape. Furthermore, our research design focused on universities with a stronger emphasis on publishing in mainstream journals. Future research could expand on this by investigating the participation circuits of Chilean universities that prioritize aspects other than an international focus. Exploring how these universities navigate their research activities within the context of government funding requirements would provide valuable insights.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This work was supported by ANID [grant number Fondecyt 1230604; FONDECYT 11201004]; UNAB [grant number NUC DI-04-21].

Notes on contributors

David Marchant-Cavieres

David Marchant-Cavieres is a sociologist with a focus on the sociology of scientific work. He is an independent academic, actively pursuing a research agenda in this field.

Carla Fardella

Carla Fardella holds a Ph.D. in social psychology and specializes in social studies of work, higher education, and the transformation of production models. With a strong commitment to bridging research and practical applications, she has contributed to the formulation of public policies in higher education, the design of scientific career development programs, and the advancement of gender equity in education and science. Carla Fardella currently serves as a full professor at the University Andrés Bello.

Fernando A. Valenzuela

Fernando A. Valenzuela is an Associate Professor at the School of Social Sciences, University Andrés Bello, Chile, where he also serves as the Director of the Sociology Program. He holds a Doctorate in Sociology from the Universität Luzern, Switzerland, and earned his bachelor's and master's degrees in Sociology from the Pontificia Universidad Católica de Chile. In his most recent projects, he has conducted research on the contemporary conditions of scientific work and explored the sociocultural dimensions of telemedicine.

Juan Felipe Espinosa-Cristia

Juan Felipe Espinosa-Cristia specializes in the field of management education, with a particular emphasis on knowledge production processes and the impact of technology on organizations and society. His research portfolio encompasses projects exploring technical mediation within banking environments and innovation management practices in both startup and established companies. Dr. Espinosa-Cristia currently holds the position of Associate Professor at the Universidad Técnica Federico Santa María, and he earned his Doctorate in Management from the University of Leicester.

Paulina E. Varas

Paulina E. Varas is a specialist in the field of art theory and has authored numerous articles on contemporary Chilean and Latin American art. She has actively engaged in artistic endeavors, participating as an exhibitor, speaker, and lecturer. Dr. Varas holds the position of Full Professor and Researcher at the University Andrés Bello, and she earned her Ph.D. in Art History and Theory from the University of Barcelona.

Claudio Broitman

Claudio Broitman's research interests encompass a wide range of areas, including environmental communication, risk assessment, knowledge production, discourse analysis, and socio-technical controversies. Dr. Broitman holds the position of Associate Professor and serves as the Director of the School of Journalism at the University Andrés Bello. He earned his Ph.D. in Sciences de l'information et de la Communication from the Université Paris-Sorbonne.

References

  • Abramo, G., and C. A. D’Angelo. 2015. “Evaluating University Research: Same Performance Indicator, Different Rankings.” Journal of Informetrics 9 (3): 514–525. https://doi.org/10.1016/j.joi.2015.04.002.
  • Androgué, C., A. García de Fanelli, M. Pita Carranza, and D. Salto. 2019. “Las Universidades Frente al Aseguramiento de la Calidad y las Políticas de Financiamiento de la Investigación: Estudios de Caso en el Sector Privado Argentino.” Revista de la Educación Superior 48 (190): 45–70. https://doi.org/10.36857/resu.2019.190.711.
  • Araneda-Guirriman, C., J. Gairín-Sallán, and L. Pedraja-Rejas. 2018. “La Autonomía en la Educación Superior: Reflexiones Desde los Actores en el Contexto del Financiamiento por Desempeño en Chile.” Formación Universitaria 11 (4): 65–74. https://doi.org/10.4067/S0718-50062018000400065.
  • Atkinson, P. A., and A. Coffey. 1997. “Analysing Documentary Realities.” In Qualitative Research: Theory, Method and Practice, edited by D. Silverman, 45–62. London: Sage.
  • Barron, G. R. S. 2021. “Rankings as Surveillance Assemblage.” In Global University Rankings and the Politics of Knowledge, edited by M. Stack, 172–194. Toronto: University of Toronto Press.
  • Beigel, F. 2014. “Publishing from the Periphery: Structural Heterogeneity and Segmented Circuits. The Evaluation of Scientific Publications for Tenure in Argentina’s CONICET.” Current Sociology 62 (5): 743–765. https://doi.org/10.1177/0011392114533977.
  • Beigel, F., O. Gallardo, and F. Bekerman. 2018. “Institutional Expansion and Scientific Development in the Periphery: The Structural Heterogeneity of Argentina’s Academic Field.” Minerva 56 (3): 305–331. https://doi.org/10.1007/s11024-017-9340-2.
  • Bhopal, K. 2016. The Experiences of Black and Minority Ethnic Academics: A Comparative Study of the Unequal Academy. Oxford: Routledge.
  • Bowker, G., and S. L. Star. 1999. Sorting Things Out: Classification and its Consequences. Cambridge, MA: MIT Press.
  • Broitman, C., and P. Rivero. 2022. “La movilidad científica y los significados en torno a la utilidad en la producción y circulación de conocimientos: Un análisis a partir del programa Becas Chile.” Universum (Talca) 37 (2): 457–478. http://doi.org/10.4067/s0718-23762022000200457.
  • Brunner, J. J., F. Ganga-Contreras, and E. Rodríguez-Ponce. 2018. “Gobernanza del Capitalismo Académico: Aproximaciones Desde Chile.” Revista Venezolana de Gerencia, Esp 1: 11–28. https://www.redalyc.org/articulo.oa?id=29062781001.
  • Burrows, R. 2012. “Living with the H-Index? Metric Assemblages in the Contemporary Academy.” The Sociological Review 60 (2): 355–372. https://doi.org/10.1111/j.1467-954X.2012.02077.x.
  • Busch, L. 2013. Standards: Recipes for Reality. Cambridge, MA: The MIT Press. https://doi.org/10.1111/j.1471-0366.2013.00378.x
  • Castillo, A., and J. Watson. 2017. “Academic Entrepreneurial Behavior: Birds of More Than One Feather.” Technovation 64-65: 50–57. https://doi.org/10.1016/j.technovation.2017.07.001.
  • Chiappa, R. 2021. “Finding Academics Jobs in Stratified Countries: The Effects of Social Class of Origin in the Development of Academic Networks for Chilean PhD.” In The Global Scholar: Implications for Postgraduate Studies and Supervision, edited by En Rule, 189–213. Western Cape: African Sun Media.
  • Clarke, C., and D. Knights. 2015. “Careering Through Academia: Securing Identities or Engaging Ethical Subjectivities?” Human Relations 68 (12): 1865–1888. https://doi.org/10.1177/0018726715570978.
  • Cruz-Castro, L., and L. Sanz-Menéndez. 2018. “Autonomy and Authority in Public Research Organisations: Structure and Funding Factors.” Minerva 56 (2): 135–160. https://doi.org/10.1007/s11024-018-9349-1.
  • Fardella, C., A. Corvalán-Navia, and R. Zavala. 2019. “El Académico Cuantificado. La Gestión Performativa a Través de los Instrumentos de Medición en la Ciencia.” Psicología, Conocimiento y Sociedad 9 (2): 62–78. https://doi.org/10.26864/pcs.v9.n2.15.
  • Fardella-Cisternas, C., J.-F. Espinosa-Cristia, and J.-M. Garrido-Wainer. 2023. “Administradores de la producción científica y gobernanza académica: Análisis de un conflicto identitario.” Revista Iberoamericana de Educación Superior 14 (41): 3–19. https://doi.org/10.22201/iisue.20072872e.2023.41.157.
  • Feld, A., and P. Kreimer. 2019. “¿Cosmopolitismo o Subordinación? La Participación de Científicos Latinoamericanos en Programas Europeos: Motivaciones y Dinámicas Analizadas Desde el Punto de Vista de los Líderes Europeos.” História, Ciências, Saúde-Manguinhos 26 (3): 779–799. https://doi.org/10.1590/S0104-59702019000300004.
  • García de Fanelli, A. 2019. “El Financiamiento de la Educación Superior en América Latina: Tendencias e Instrumentos de Financiamiento.” Propuesta Educativa 28 (2): 111–126. https://souciencia.unifesp.br/images/docs/pesquisas/Financiamiento_Educativo_AL_Revista_PE_FLACSO_AC_1.pdf.
  • Glaser, B., and A. Strauss. 1967. The Discovery of Grounded Theory: Strategies for Qualitative Research. Chicago, IL: Aldine Publishing.
  • Good, B., N. Vermeulen, B. Tiefenthaler, and E. Arnold. 2015. “Counting Quality? The Czech Performance-Based Research Funding System.” Research Evaluation 24 (2): 91–105. https://doi.org/10.1093/reseval/rvu035.
  • Góngora, E. 2021. “Financiamiento por Concurso Para Investigación Científica en México. Lógicas de Competencia y Experiencias de Científicos.” Revista Mexicana de Investigación Educativa 26 (88): 149–172. http://www.scielo.org.mx/pdf/rmie/v26n88/1405-6666-rmie-26-88-149.pdf.
  • Guthrie, S., I. Ghiga, and S. Wooding. 2017. “What Do We Know About Grant Peer Review in the Health Sciences?” F1000Research 6: 1335. https://doi.org/10.12688/f1000research.11917.1.
  • Hicks, D. 2012. “Performance-based University Research Funding Systems.” Research Policy 41 (2): 251–261. https://doi.org/10.1016/j.respol.2011.09.007.
  • Jiménez, J. 2019. “El Sistema Nacional de Investigadores en México Como Mecanismo Meritocrático de un Estado Evaluador.” Reflexión Política 21 (41): 81–90. https://doi.org/10.29375/01240781.2850.
  • Koch, T., R. Vanderstraeten, and R. Ayala. 2021. “Making Science International: Chilean Journals and Communities in the World of Science.” Social Studies of Science 51 (1): 121–138. https://doi.org/10.1177/0306312720949709.
  • Kreimer, P. 2011. “La Evaluación de la Actividad Científica: Desde la Indagación Sociológica a la Burocratización. Dilemas Actuales.” Propuesta Educativa 36: 59–77. https://www.redalyc.org/articulo.oa?id=403041707007.
  • Lamont, M. 2009. How Professors Think: Inside the Curious World of Academic Judgment. Cambridge, MA: Harvard University Press.
  • Lim, M. A. 2021. “Governing Higher Education: The PURE Data System and the Management of the Bibliometric Self.” Higher Education Policy 34: 238–253. https://doi.org/10.1057/s41307-018-00130-0.
  • Liu, J., Z. Yin, W. Lyu, and S. Lin. 2019. “Does Money Accelerate Faculty Mobility? Survey Findings from 11 Research Universities in China.” Sustainability 11: 6925. https://doi.org/10.3390/su11246925.
  • Lounsbury, M. 2001. “A World of Standards by Nils Brunsson, Bengt Jacobsson, and Associates.” American Journal of Sociology 107 (3): 839–841. https://doi.org/10.1086/343153.
  • Mathies, C., J. Kivistö, and M. Birnbaum. 2020. “Following the Money? Performance-Based Funding and the Changing Publication Patterns of Finnish Academics.” Higher Education 79: 21–37. https://doi.org/10.1007/s10734-019-00394-4.
  • Maxwell, J. A. 2005. Qualitative Research Design: An Interactive Approach. 2nd ed. Thousand Oaks, CA: Sage.
  • McAlpine, L., and C. Amundsen. 2015. “Early Career Researcher Challenges: Substantive and Methods-Based Insights.” Studies in Continuing Education 37 (1): 1–17. https://doi.org/10.1080/0158037X.2014.967344.
  • Ministerio de Educación. 2023. “Fondo Basal por Desempeño.” https://dfi.mineduc.cl/instrumentos-de-financiamiento/asignacion-directa/fondo-basal-por-desempeno/.
  • Mondaca, C., J. Lopatinsky, A. Montecinos, and J. Rojas-Mora. 2019. “Medición del Nivel de Desarrollo de las Universidades Chilenas: Un Análisis con Modelos de Ecuaciones Estructurales.” Calidad en la Educación 50: 284–318. https://doi.org/10.31619/caledu.n50.562.
  • Patton, M. 2014. Qualitative Research and Evaluation Methods. 4th ed. Thousand Oaks, CA: Sage.
  • Pérez, M., and A. Montoya. 2018. “La Insostenibilidad de la Universidad Pública Neoliberal: Hacia una Etnografía de la Precariedad en la Academia.” Revista de Dialectología y Tradiciones Populares 73: 9–24. https://doi.org/10.3989/rdtp.2018.01.001.01.
  • Prior, L. 2003. Using Documents in Social Research. London: Sage. https://doi.org/10.4135/9780857020222.
  • Reinhart, M., and C. Schendzielorz. 2021. “Peer Review Procedures as Practice, Decision, and Governance – Preliminaries to Theories of Peer Review.” SocArXiv, 1–19. https://doi.org/10.31235/osf.io/ybp25.
  • Reymert, I., J. Jungblut, and S. B. Borlaug. 2021. “Are Evaluative Cultures National or Global? A Cross-National Study on Evaluative Cultures in Academic Recruitment Processes in Europe.” Higher Education 82 (5): 823–843. https://doi.org/10.1007/s10734-020-00659-3.
  • Rovelli, L. 2017. “Expansión Reciente de la Política de Priorización en la Investigación Científica de las Universidades Públicas de Argentina.” Revista Iberoamericana de Educación Superior 22 (3): 103–121. https://www.redalyc.org/pdf/2991/299151245006.pdf.
  • Sisto, V. 2017. “Gobernados por Números: El Financiamiento Como Forma de Gobierno de la Universidad en Chile.” Psicoperspectivas. Individuo y Sociedad 16 (3): 64–75. https://doi.org/10.5027/psicoperspectivas-Vol16-Issue3-fulltext-1086.
  • Spence, C. 2019. “‘Judgement’ Versus ‘Metrics’ in Higher Education Management.” Higher Education 77 (5): 761–775. https://doi.org/10.1007/s10734-018-0300-z.
  • Spinello, A. O., E. Reale, and A. Zinilli. 2021. “Outlining the Orientation Toward Socially Relevant Issues in Competitive R&D Funding Instruments.” Frontiers in Research Metrics and Analytics 6: 712839. https://doi.org/10.3389/frma.2021.712839.
  • Viales-Hurtado, R. 2021. “The Problem of Scientific Policies in Central America (1980–2020): The Tension Between Innovation and Social Cohesion in a Global Context.” Tapuya: Latin American Science, Technology and Society 4 (1): 1–22. https://doi.org/10.1080/25729861.2021.1876314.
  • Vohlídalová, M. 2021. “Early-Career Women Academics: Between Neoliberalism and Gender Conservatism.” Sociological Research Online 26 (1): 27–43. https://doi.org/10.1177/1360780420914468.
  • Wetherell, M. 2007. “A Step Too Far: Discursive Psychology, Linguistic Ethnography and Questions of Identity.” Journal of Sociolinguistics 11 (5): 661–681. https://doi.org/10.1111/j.1467-9841.2007.00345.x.
  • Zacharewicz, T., B. Lepori, E. Reale, and K. Jonkers. 2019. “Performance-based Research Funding in EU Member States – a Comparative Assessment.” Science and Public Policy 46 (1): 105–115. https://doi.org/10.1093/scipol/scy041.