RIMS: The Research Impact Measurement Service at the University of New South Wales

Abstract

In 2005, the Library at UNSW began a comprehensive restructuring process that fundamentally changed the provision of services to its academic community. A primary aim of this process was to increase flexibility of service development and delivery and so to improve research support. The motivation for reformed services arose from considerations including the University Library’s need to realign its services to support the university’s strategic goals, the increasingly competitive nature of the research environment, the introduction of the RQF/ERA, and a renewed emphasis on research outcomes by UNSW. The measurement of research impact using bibliometrics was seen as a strategy for supporting UNSW researchers. University Library staff consulted the bibliometrics literature and appropriate methodologies were devised to measure the impact of publications, authors and departments. The result was the creation of a Research Impact Measurement Service (RIMS) that now produces over 30 reports every month and employs 6-7 full-time equivalent staff. Most of the reports are used to support promotion, grants, and institutional comparisons. This research support service also informs and improves the performance of such traditional library activities as collection development. RIMS is now integral to the measurement of research outputs at UNSW, and has significantly raised the profile of the Library throughout the academic community.

AARL June 2009 vol 40 no 2 pp76–87

Introduction

For the past decade, Information Literacy (IL) has been a primary focus of library support to the academic community, and the topic features prominently in Library and Information Studies (LIS) literatureFootnote1. Hence, library support to academic staff has largely been framed around learning and teaching activitiesFootnote2. In contrast, research activities have received only peripheral attention and have in general been far less influential on academic library policies. However, recent changes in government policy in many Western countries such as the UK and Australia have resulted in renewed emphasis on research outcomesFootnote4. Such changes are part of an emerging “culture of evaluation” in relation to the funding of university and research sectorsFootnote5. Consequently, academics at all Australian universities are now charged with the imperative to be ‘research active’, with institutionally defined levels of activity and within a context of government-based research assessment frameworks. Academic libraries have, therefore, been forced to reassess the type of support offered to researchers and parent institutions.

In 2005 the Australian Federal Government published a discussion paperFootnote6, outlining a proposal for a research assessment exercise similar to those already undertaken in other countries such as the UK. The resultant ‘Research Quality Framework’ (RQF) specified that research groups at Australian universities would be required to provide evidence of the impact of their research. This impact is determined by a variety of measures including the analyses of patents, cost-benefits, social return, and citationsFootnote7. The implications for Australian academic libraries were particularly significant, as these institutions have traditionally hosted extensive bibliometric expertiseFootnote8. While the fledgling RQF was superseded by Excellence in Research for Australia (ERA)Footnote9 in 2007, the emphasis on measuring research impact remained. ‘Citation Analysis’, although not universally accepted as a measure of impactFootnote10 Footnote11, was specifically identified as one of the indicators of Research Quality by the ERAFootnote12. The need to develop appropriate methodologies to assess impact through citations is therefore of enduring and paramount importance in the new Australian tertiary landscape. The University Library at the University of New South Wales (UNSW) is attempting to meet this challenge.

UNSW is a large metropolitan institution, a member of the Group of EightFootnote13 (Go8) and has strong traditions in research and innovation. In recent years, UNSW has expanded rapidly and now has nearly 40,000 students with over 300 undergraduate and 600 postgraduate programmes. The institution has eight faculties: Built Environment, Engineering, Law, Medicine, Science, Fine Arts, Arts/Social Science and Business. UNSW is affiliated with over 100 research centres, including many of international significance such as the Garvan Institute for Medical Research and the Social Policy Research Centre. The institution also has strong links with the business community and attracts funding from a wide variety of sources.

Restructuring at the University Library

Prior to restructuring, library support to researchers at UNSW was limited to traditional functions such as reference, consultations and IL. After an extensive process of review, the University Library merged five subject-based ‘special’ libraries into one Information Services Department in 2006Footnote14. With the exception of the Fine Arts and Law subject areas, the collections were also integrated into one building and special library staff filled the majority of positions in the new structure. A tiered reference service was also introduced to give professional librarians time to deliver expert services to the UNSW academic community. Such changes typified responses to the new funding environment for Australian universities:

[l]ibraries started to review the number of service points which operated within their organization and began to rationalize these, using their diminishing human resources more efficiently. The value of reference services and the contribution of reference librarians began to be reassessed and academic libraries witnessed the implementation of reference by appointment and the limiting of reference services to primary usersFootnote15

The restructure of information services at the University Library represented a more radical departure from traditional academic library structures, amounting to the adoption of a business modelFootnote16. The new Information Services Department (ISD) was formed and comprised three units: the Service Innovation Unit (SIU), the Service Development Unit (SDU), and the Academic Services Unit (ASU). SIU and SDU were created to provide the research and development functions respectively. ASU is made up of the Outreach Team, the Collection Team and the Services Team. The Outreach Team promotes the University Library to the academic community and elicits feedback. The Collection Team uses evidence-based practices to systematically develop the collection. Finally, the Services Team performs the more traditional services of reference, research consultations and IL. In short, the SIU develops ideas, the SDU designs the products, and the ASU provides services directly to the academic community (see Appendix 1).

At this stage, a communications model was instituted to ensure representation of University Library staff at all levels of contact with the academy. This included personalised contact with academics through Outreach Librarians and representation at School meetings; senior ISD managers’ presence and input at Faculty Board meetings, and education and research committee meetings; University Library Senior Management representation on high level university committees; and a regular communication channel through a ‘Library Update’ distributed to the Presiding Members of each Faculty and to all staff. Vitally, the contact with the UNSW academic community ensured a valuable feedback loop.

The new structure was designed to provide flexibility to adapt to changes in the academic and broader information environments. It allowed for dedicated teams which would be responsible for different aspects of the work of academic librarians, rather than expecting them to be generalists, and to initiate and develop new services as extra projects. Furthermore, the restructure was designed to bring greater focus and concentration on particular activities in the work environment, building greater expertise as well as enabling the rapid implementation of new and modified services.

Bibliometrics: A New Service

One of the primary roles assigned to the new SIU was the definition of new services. Extensive consultation of the LIS literature confirmed that the majority of academic libraries, both national and international, offered only very traditional support to researchersFootnote17. These services included document delivery, acquisition, research consultations, reference, bibliographic software classes, and alerting services. Bibliometrics did not feature on the list, despite its potential benefits for libraries. Nevertheless, recent studies had identified bibliometric analysis as a “new business area for information professionals in libraries”, and an understanding was emerging that associated measures may eventually form the basis of new research evaluation systemsFootnote19.

Citation counts have been used as a way of “evaluating the importance of scientific work” since 1927Footnote20, and the benefits of bibliometrics to collection management practices were identified over a quarter of a century agoFootnote21. Until recently, however, no evidence was available to indicate the existence of a comprehensive bibliometric service at a university library in Australia. In 2007-8, the Library at the University of South Australia provided some bibliometric support to academic staff in the form of a basic h-indexFootnote22 (see below). In Europe, Lund UniversityFootnote23 and Forschungszentrum JülichFootnote24 have already implemented bibliometrics services. Recently Elsevier also outlined a number of forays into bibliometrics by academic libraries in South Korea, Austria, Lebanon, and the United States of AmericaFootnote25.

In December 2006, SIU contributed to a discussion paper for UNSW senior management outlining potential ways the Library could support the university through RQF processes. Soon afterwards, ISD offered limited training to UNSW staff on measuring research impact, including instruction on the use of citation databases. This eventually evolved into a seminar, delivered in association with the UNSW Research Office, entitled ‘Measuring Research’, which was attended by over 300 academic staff. This event generated a significant degree of interest regarding possibilities for measuring research in terms of citations. Academics, being time-poor, sought assistance with this new and complex environment. Many required support to complete citation analyses due to the technical complexities of citation databases and their lack of conceptual understanding of the bibliometric landscape. Furthermore, most academics at UNSW were unable or unwilling to develop the required skills, given the competing pressure to extend expertise in their own disciplines. Hence, there arose a new opportunity to develop a research impact measurement service in the only university body with the required expertise: the Library.

In 2007, the Vice Chancellor also announced a new set of strategic directions for UNSW, including an intention to improve research performance, particularly in comparison to and in competition with the research-intensive Go8 universitiesFootnote26. At the same time, the University passed responsibility for a number of data-related services supporting the research process to the Library, such as the research publications data submission for the Higher Education Research Data Collection (HERDC). As the University Library was attempting to align itself more closely with the strategic directions of UNSW, the development of services to improve research outcomes became more pressing. It became increasingly apparent to the University Library that bibliometrics might provide the methodology to support the wider UNSW research community in the new research-outcomes-focussed environment.

A literature search revealed that there existed only a small amount of published information on libraries introducing bibliometrics services. It appeared that the University Library would need to define a service based on the emerging needs of UNSW researchers. A new application from Thomson ISI called Publication Activity Reports, a component of Journal Use Reports, became a tool of significant interest. This product made possible the production of School-based reports on publication and citation patterns generated from Web of Science data. A presentation of this data was delivered to a key Faculty of Medicine research centre at a seminar for their researchers. It was extremely well-received and further confirmed the need for a defined service around the measurement of research impact, and of ways to enhance the impact of research outputs.

As with each new service instigated by SIU, a ‘Service Definition’ document was prepared for the Research Impact Measurement Service (RIMS) at the end of 2007. It outlined the strategic intent of the service; described the elements of the service and the associated accountabilities; included a ‘sales pitch’ for the Outreach Team to ‘sell’ the service to the academy; and suggested some success criteria and associated performance measures that could be used to review the service at a later date. Measurement of research impact was initially proposed in relation to both individuals and institutional bodies such as schools and faculties. Bibliometrics would provide the toolset for the measurement of this impact, delivered through the vehicle of RIMS.

The creation of the RIMS service was made possible by the radical restructure of the University Library. For the first time a dedicated innovation team (SIU) was able to identify new products in response to the needs of the academic community. These needs were informed by the Outreach Team, which specialised in communicating with academic staff. The new SDU was then able to develop specialised products. Finally, the new multi-disciplinary Services Team, freed from other more traditional responsibilities by the clear delineation of roles within the new structure, was able to deliver material according to SIU specifications. This team assumed responsibility for completing the new reports of the fledgling RIMS service. Initially it was envisaged that about 30% of work in the new role would be allocated to each of reference, research consultations and RIMS. It soon became evident, however, that RIMS would constitute at least 50% of the role of a Services Team member, making this role unique in the Australian academic library landscape.

Services to Individual Researchers

SIU identified methodologies for capturing the performance of individual researchers. A wide variety of measures were required, as methodologies vary greatly depending on individual disciplines. The h-index, a score derived from the distribution of citations to an author’s publications and originally developed by Jorge Hirsch to compare the impact of physicistsFootnote27, is widely accepted as an indicator of the ‘impact’ of an author. This methodology is suitable for researchers in the sciences, engineering, and medicine discipline areas. A basic h-index can be calculated from a single citation database, but results are usually inaccurate because all citations to a particular publication are rarely captured in a single source. Therefore, the results of several citation databases need to be combined and duplicates removed in order to calculate the most accurate result. This complex process involves considerable time and demands substantial expertise. SIU developed a methodology for calculating and presenting an h-index with supporting documentation in the form of a spreadsheet.
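The calculation underlying the spreadsheet can be sketched as follows, purely for illustration: citing works from several databases are merged and de-duplicated for each publication, and h is then the largest number such that h publications have at least h citations each. The field names below (“doi”, “title”, “citing_works”) are assumptions made for this example and do not reflect the Library’s actual data layout or tools.

```python
# A minimal sketch of the h-index calculation described above, assuming
# hypothetical record fields ("doi", "title", "citing_works"); it is not
# the Library's actual spreadsheet-based workflow.

def combined_citation_counts(*sources):
    """Combine citing-work lists from several citation databases,
    removing duplicates so each citing work is counted only once."""
    citing = {}  # publication key -> set of citing-work identifiers
    for source in sources:
        for record in source:
            key = record.get("doi") or record["title"].strip().lower()
            works = {w.strip().lower() for w in record["citing_works"]}
            citing.setdefault(key, set()).update(works)
    return {key: len(works) for key, works in citing.items()}

def h_index(citation_counts):
    """Largest h such that h publications each have at least h citations."""
    h = 0
    for rank, cites in enumerate(sorted(citation_counts, reverse=True), 1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Made-up records from two databases, overlapping on one citing work:
web_of_science = [
    {"doi": "10.1000/a", "title": "Paper A", "citing_works": ["w1", "w2"]},
    {"doi": None, "title": "Paper B", "citing_works": ["w3"]},
]
scopus = [
    {"doi": "10.1000/a", "title": "Paper A", "citing_works": ["w2", "w4"]},
]

counts = combined_citation_counts(web_of_science, scopus)
print(counts)                     # {'10.1000/a': 3, 'paper b': 1}
print(h_index(counts.values()))   # 1
```

As the article notes, it is the matching of citing works across databases that makes the manual process so time-consuming, since the same citing article may be described differently in each source.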

This h-index (HI) spreadsheet provides details of each publication and citing articles. This proved essential in the identification of the nature and impact of particular pieces of research and also in measuring the overall impact of the researcher. After a few months of providing this service, it became apparent that not all researchers required such detailed information. Some researchers merely required a summary of their publications and related citations without the detailed backing data, particularly for performance review and promotion purposes. Consequently the ‘short h-index’ was developed, and it quickly became as popular as the fuller version, especially as delivery times were considerably shorter.

Support for academics in humanities and social sciences is much more problematicFootnote28, as publications in these disciplines are often not indexed in citation databases such as Scopus and Web of Science. The publishing and citation behaviour of researchers in these disciplines is also quite different to the sciences. Therefore, SIU developed a technique for identifying the top 5-10 publications of a researcher (depending on the total volume of publications), and conducting a comprehensive citation search based on a wide variety of databases including Google Scholar. This product became known as a Research Impact Statement (RIS), and is also presented to the academic in the form of a spreadsheet.

An alternative measure for disciplines outside of the sciences was provided in the form of a Citation Count (CC). A similar method to that used to calculate the h-index is used, but in this case only the details of citations are presented for each publication. This usually involves time-consuming searching of Google Scholar and the result is presented in the form of a spreadsheet. Considerable resources, however, are required for this product. Although the results are often quite illuminating in terms of the broader impact of the researcher’s outputs, in many instances the RIS has proved to be quite adequate.

After producing HIs, CCs and RISs for several months, it became apparent that many researchers were primarily concerned with summaries, and that much of the detail provided in existing reports was unnecessary. This was especially true when the information was requested for the purposes of a grant application. Feedback from the Outreach Librarians and other faculty communication channels indicated that the compilation of this sort of information was a struggle for many researchers, particularly those in the early stages of their careers. In such cases, the information from the RIMS report is often copied directly onto the relevant grant application document. Accordingly, the Grant Application Statement (GAS) was developed. This report can usually be produced within a two-week period and consists of a range of research impact measures (Table 1). In some cases, a researcher can also nominate a colleague in another institution for direct comparison. The potential of other new bibliometric measures, such as the ‘Libcitation’Footnote29 method, is also being explored by SIU.

Table 1 A typical researcher profile in a GAS

Services to Schools and Faculties

In many cases, researchers in UNSW schools, faculties and research centres are working in relative isolation from their colleagues at other institutions. It is often very useful, therefore, to have access to comparative data, especially in the current environment of competitive research funding. Consequently SIU developed a technique, largely based on ‘Web of Science’ data and tools provided by Thomson ISI, to analyse the publications and associated citations of an academic department, school or faculty. In some cases, comparisons are made with bodies external to UNSW. Usually, data is considered for the previous five-year period. This report was termed a Publication Activity Report (PAR) and has become very popular with faculty senior management as a means to assess performance in relation to Go8 equivalents or other nominated universities or research centres. This report is presented in a document featuring graphs, tables and analysis. In 2008 PARs were compiled for several individual schools as well as for an entire faculty. The report for the faculty was especially problematic, as it involved comparison with other Go8 equivalents. This necessitated a complex process of identifying equivalent areas of research at other institutions in order to perform meaningful comparisons.
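The shape of the analysis behind a PAR can be illustrated, in simplified form, as a grouping of publication records by institution and year, with publication and citation totals summed over the comparison period. The field names below are assumptions for this sketch only and do not reflect the Thomson ISI tools actually used.

```python
# Illustrative sketch of a PAR-style comparison: publication records
# (hypothetical field names) are grouped by institution and year, and
# publication and citation totals are summed for each group.
from collections import defaultdict

def par_summary(records, years):
    """Per-institution, per-year publication and citation totals."""
    summary = defaultdict(lambda: {"publications": 0, "citations": 0})
    for rec in records:
        if rec["year"] in years:
            cell = summary[(rec["institution"], rec["year"])]
            cell["publications"] += 1
            cell["citations"] += rec["citations"]
    return dict(summary)

records = [
    {"institution": "UNSW", "year": 2007, "citations": 10},
    {"institution": "UNSW", "year": 2008, "citations": 3},
    {"institution": "Comparator", "year": 2008, "citations": 7},
]
print(par_summary(records, years=range(2004, 2009)))
```

The actual reports go further, adding graphs, tables and analysis and, for cross-institutional comparisons, the identification of equivalent areas of research as described above.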

Another option offered to researchers is the ‘Research Trends Report’ (RTR). This report is more general in scope and nature, and provides an overview of trends and patterns within a nominated discipline. Aspects such as emerging areas of research, high-performing institutions or researchers, and important publications are highlighted. The analytical tools and search-refining features provided within Web of Science and Elsevier’s Scopus have been used to develop these reports. Recently, reports have been compiled for completely new areas of research, and other ‘competing’ institutions have been identified. This information is also provided in the form of a written report including graphs and tables.

A further product offered to indicate and elucidate research trends is the Journal Impact Report (JIR). This relatively short report is a summary of the impact of journals within a specified discipline. In some cases leading journals are identified and in others the performance of a journal over several years may be highlighted. It is usually presented in a tabular format based on ‘Impact Factor’ data from ISI’s Journal Citation Reports; journal trend analysis from Scopus; and from ERA journal ranking data. JIRs are particularly useful for researchers attempting to identify the most appropriate publications for their research outputs.

Summary of RIMS Products (March 2009)

The RIMS work has increased the University Library’s involvement in many other research support activities. This includes analysis for faculty and school review processes, both internal and external. Analysis of data has also been undertaken to inform Faculties on issues regarding indicator analysis for the ERA trials. This includes the performance of faculty research outputs against journal rankings for the various Fields of ResearchFootnote30 undergoing assessment, and assistance with submissions on changes to proposed journal rankings. The service is also being expanded to meet other articulated needs such as advice on developing publishing strategies, and requests for training programs for early career researchers on measuring impact.

As shown in Table 2, over 30 RIMS reports are now completed each month. Coordinating the production of so many different requests from ten Outreach Librarians, allocating each task to one of thirteen Services Librarians, and finally delivering the report to the researcher is obviously and inevitably challenging work. Demand is also inconsistent, and at certain times, such as during academic promotion rounds, there may be as many as 50 RIMS reports – at varying degrees of completion – in the system. ‘RefTracker’™Footnote31, already deployed by the University Library to administer other services such as tiered reference, has also been successfully used to manage these complex RIMS workflows.

Table 2 Summary of the various RIMS reports

The Impact of RIMS on the Wider University Library at UNSW

RIMS has radically redefined the way that the University Library engages with the academic community at UNSW. Furthermore, RIMS has transformed the work of the thirteen-member Services Team, who had previously been responsible for the more traditional activities of reference, information literacy and research consultations. RIMS now accounts for over 50% of their work, has renewed enthusiasm, and has created a new sense of purpose. This last contribution is especially important in the current environment, in which many librarians feel that their relevance is waning due to ‘Google-isation’ and the information revolution.

While RIMS has been an innovative and highly sought after service to the UNSW academic community, it has also informed existing practices at the University Library. Knowledge gained about publishing and citation patterns of UNSW researchers has guided collection development, and will allow the University Library to more closely align the collection with the needs and practices of UNSW researchers. Understanding of publishing behaviours has also enabled the identification of research groups likely to benefit from the use of the institutional repository to increase their impact. It has also highlighted groups that could benefit from training programs and advice on higher-impact publishing.

Conclusion

Renewed emphasis on research at UNSW, together with the restructuring of the University Library, has provided the opportunity to support the academic community in innovative ways. The main focus of this transformation has been the development of a service to provide valuable information for individual promotion, grant applications and institutional comparison. The development of this highly valued service has been made possible by the formation of the SIU (innovation), the SDU (development), the Outreach Team (sales), and the Services Team (report preparation). The new flexible information service structure at the University Library at UNSW is very well-placed to respond to the changing academic environment and to further incorporate support for research in the general strategic direction.

RIMS is now of primary importance, with seven full-time equivalent staff employed directly in the creation of RIMS reports, although, due to other functions, these tasks comprise no more than half of any individual’s total responsibilities. The comprehensive nature of such a research support service using bibliometrics appears to be unique among Australian academic libraries. The rapid uptake of RIMS products and the positive feedback from UNSW researchers and faculty administrators indicate that this service has become invaluable, and has significantly elevated the profile of the University Library in the UNSW community. What is more, this service is still evolving. The early surge in demand for h-index calculations, for example, has now been replaced by demand for the more generalist GAS and PARs.

As a result of articulated demand from researchers, investigation is already underway into potential new RIMS products, such as competitive intelligence reports at both individual and research group level, and advisory services on strategies for increasing the impact of researchers’ work and personal profiles. Investigation has also begun into ways of automating elements of RIMS and into means of exploiting the new measurement tools being developed by database publishers. It is expected that further evolution and refinement of RIMS will occur as the impacts of the ERA and the new funding environment become apparent. As the RIMS service develops, the ‘measurable’ value of the service will become clearer, but the initial indications are very positive.

Notes

1. Korobili, Stella, Aphrodite Malliari, and George Christodoulou. 2008. Information literacy paradigm in academic libraries in Greece and Cyprus. Reference Services Review 36, (2): p180.

2. Kuh, G. D., and R. M. Gonyea. 2003. The role of the academic library in promoting student engagement in learning. College & Research Libraries 64, (4) (Jul): pp256-82.

4. Geuna, A and Martin, B.R. 2003. University Research Evaluation and Funding: An International Comparison. Minerva 41, 4: pp277-304.

5. Butler, Linda. 2003. Explaining Australia’s increased share of ISI publications - the effects of a funding formula based on publication counts. Research Policy 32: pp143-55.

6. Expert Advisory Group for an RQF. 2005. Research Quality Framework: Assessing the quality and impact of research in Australia. Canberra: Commonwealth of Australia. http://www.dest.gov.au/NR/rdonlyres/B851C4B4-7F66-4F91-964B-76ECBB527E7A/5618/adv_approach.pdf (accessed 23 March 2009)

7. Johnson, R. 1995. Research impact quantification. Scientometrics 34(3): p415

8. Haddow, Gaby. 2007. Academic Libraries and the Research Quality Framework. Australian Academic and Research Libraries 38, 1: pp26-39

9. http://www.arc.gov.au/era/default.htm (accessed 23 March 2009)

10. Bornmann, Lutz, and Hans-Dieter Daniel. 2008. What do citation counts measure? A review of studies on citing behavior. Journal of Documentation 64, (1): p45

11. Warner, J. 2000. Critical review of the application of citation studies to the Research Assessment Exercises Journal of Information Science 26(6): pp453-460.

12. Australian Research Council. 2008. ERA Indicator Descriptors. Canberra: Commonwealth of Australia. http://www.arc.gov.au/pdf/ERA_Indicator_Descriptors.pdf (accessed 23 March 2009)

13. http://www.go8.edu.au/ (accessed 23 March 2009)

14. Wells, Andrew. 2007. A prototype twenty-first century university library. Library Management 28, (8/9): p450.

15. Burke, Liz. 2008. Models of reference services in Australian academic libraries. Journal of Librarianship and Information Science 40, (4) (December 1): pp269-86

16. Bosanquet, Lyn. 2009. Deliver a positive ROI - can value be measured? Paper presented at Information Online: 14th Annual Exhibition and Conference, Sydney Conference and Exhibition Centre, http://www.information-online.com.au (accessed 23 March 2009).

17. Webb, Jo, Pat Gannon-Leary and Moira Bent. 2007. Providing effective library services for research. London: Facet Publishing.

19. Joint, Nicholas. 2008. Bemused by bibliometrics: Using citation analysis to evaluate research quality. Library Review 57, (5): p346.

20. Bornmann, Lutz, and Hans-Dieter Daniel. 2008. What do citation counts measure? A review of studies on citing behavior. Journal of Documentation 64, (1): p45.

21. Warr, R. B. 1983. Bibliometrics: A model for judging quality. Collection Building 5, (2): pp29-34

22. Gibbs, Carole and Sergeant, Kate. 2009. Opportunity not hard work: Scripted solutions to solving our bibliometric nightmare. Paper presented at Information Online: 14th Annual Exhibition and Conference, Sydney Conference and Exhibition Centre, http://www.information-online.com.au (accessed 23 March 2009).

24. http://www.fz-juelich.de/zb/Bibliometrics/ (accessed 24 March 2009)

25. Elsevier. 2007 Library Connect Newsletter, Aug: pp8-9

26. B2B Blueprint to Beyond 2010: UNSW Strategic Intent. http://www.unsw.edu.au/about/pad/B2B_UNSW_Strategic_Intent.pdf

27. Hirsch, J. E. 2005. An index to quantify an individual’s scientific research output. Proceedings of the National Academy of Sciences of the United States of America 102, (46) (Nov. 15): pp16569-72.

28. Council for the Humanities, Arts and Social Sciences. 2005. Measures of quality and impact of publicly funded research in the humanities, arts and social sciences. Occasional Paper No. 2. http://www.chass.org.au/papers/pdf/PAP20051101JP.pdf (accessed 20 April 2009).

29. White, H. D. et al. 2009. Libcitations: A measure for comparative assessment of book publications in the humanities and social sciences. Journal of the American Society for Information Science and Technology: [forthcoming].

30. Australian and New Zealand Standard Research Classification (ANZSRC), 2008.

31. http://www.altarama.com.au/reftrack.htm (accessed 24 March 2009)
