
Scientometric analysis of the publishing behaviour of EU + UK authors in engineering education and further afield

Andrew Valentine & Bill Williams
Pages 389-410 | Received 02 Mar 2022, Accepted 18 Nov 2023, Published online: 07 Dec 2023

ABSTRACT

The authors present a scientometric procedure comparing engineering education (EE) publication output in different countries. Selected European countries were compared using a snapshot of authors published in EE journals during a two-year period – 895 in all. The entire publication careers of these authors – 39,322 publications – were analysed to determine the breakdown of educational and non-educational publications. Spain and the UK produced the most publications, and we propose explanations based on relevant national policies. France, Germany, and Italy had notably fewer EE publications relative to their general engineering and science research outputs, whereas Portugal, Ireland, and three Nordic countries showed the opposite pattern. Countries varied widely in the ratio of educational to non-educational publications. Non-educational publications typically had a greater impact on authors’ h-index values. We believe this procedure can longitudinally map and compare EE publication output in and across Europe, providing a valuable resource for policy-makers and researchers in the region.

Introduction

Published research on EE can be broadly divided into two groups: studies that looked at broad tendencies in the field and those that focused on particular countries (e.g. US, UK, Portugal, Australia, Malaysia) or small groups of countries (e.g. the Nordic countries, Spain and Portugal).

Within a European context, the authors identified only one study that takes a Europe-wide perspective on EE publication patterns (Lima and Mesquita 2018), and the approach it adopted has significant limitations. The present study aims to provide a more complete comparison of European output by applying scientometric analysis. In addition, we propose that our procedure can be adopted in future work to track EE publication evolution within the region.

We firstly take a snapshot of 895 authors in European countries that published in the leading EE journals in the years 2018 and 2019. We then go on to look at the entire publishing careers of these authors.

Literature review

There has been a growing interest in the evolution of EE as a field of inquiry since the turn of the century. Froyd and Lohmann (2014) used criteria for defining the field of science education research (Fensham 2004) to point out that while engineering education has been seen as an area of interest for educators since the end of the nineteenth century, over the last two decades there have been significant indicators of a transition to an interdisciplinary, more scholarly field of scientific inquiry into engineering education.

However, Klassen and Case (2022) recently published an article that takes a global look at the scholarship in this field to date and they conclude that ‘The argument for engineering education research as a strongly classified field has served value in establishing legitimacy and associated resources in some contexts but has not yet delivered a unique knowledge base for such legitimation’.

We have only been able to identify one study that compared all the European countries in terms of their EE publication output. This investigation by Lima and Mesquita (2018) analysed the Scopus database using the search term ‘engineering education’ and identified 4604 journal articles from European-affiliated authors published between 1970 and 2017. Based on their study, these authors concluded that engineering education research was growing in Europe and had surpassed the USA by 2018, but that Europe’s output relative to the whole world was decreasing. However, as these authors based their entire analysis on the presence of this single search term, it is likely their sample included many articles that were technically rather than educationally focused. Of the 25 journals they included, many did not have education as their main focus: eight were engineering journals while four were devoted to educational technology. In addition, the authors did not apply any value criterion to their search results.

Moving from the general to the particular, we find a range of studies:

Borrego and Bernhard (2011) compared Northern and Central European approaches to engineering education research (EER) with those of the US using a framework from the European didaktik tradition, which focuses on answering the w-questions of education. Borrego and Olds (2011) employed an analysis of National Science Foundation funded projects as a way of characterising development in engineering education research in the US, while Williams and Alias (2011) used a scientometric approach to track the evolution of EE publication in Malaysia. Dart and Cunningham-Nelson (2020) used textual analysis to identify publication trends in the Australasian context.

Neto and Williams (2017) analysed historical studies of the European Journal of Engineering Education (EJEE) to provide insights on the European context. Other studies looked at specific European national contexts such as the Nordic countries, the UK and Portugal (Edström et al. 2018; Nyamapfene and Williams 2017; van Hattum-Janssen, Williams, and Nunes de Oliveira 2015; Williams, Wankat, and Neto 2018; Wint and Nyamapfene 2021).

A small set of data from the Australian context was originally reported based on analysis of three EE journals (Valentine 2020). The present authors went on to compare the EE publication profile of scholars based in two Southern European countries, Spain and Portugal (Valentine and Williams 2021a), and four Nordic ones, Denmark, Sweden, Finland and Norway (Valentine and Williams 2021b), and these six European countries with Australia (Valentine and Williams 2021c).

While the publications listed above show that scientometric approaches have been gaining traction for studying engineering education publications in recent years, related fields such as science education (Wang et al. 2023), computing education (Apiola et al. 2022; López-Pernas, Saqr, and Apiola 2023), and mathematics education (Akin and Güzeller 2022) have also seen increasing use of scientometric analysis.

Motivation for the study

In line with the EJEE scope, which encourages submissions that combine scholarliness with usefulness for improving engineering education, the primary motivation for collecting and analysing the data presented here was to provide a resource for EE researchers and policy makers in European countries. Although there are macro aspects of research policy determined at EU level (for example the Horizon Europe and Erasmus+ programmes), most policy with respect to engineering education research is determined at either national or institutional level, the only exception we are aware of being the Nordic Network, which has taken a regional approach to supporting engineering education research. For this reason, the authors have opted for a country-based unit of study.

Comparative data at country level is expected to be useful for decision makers at national and institutional level considering changes in engineering education, for example with respect to funding projects, hiring staff, or creating and accrediting PhD programmes. Apart from the value of such empirical data for policy makers, we see it being useful to European engineering education practitioners in general while being particularly valuable to scholars setting out on a path to doctoral research. As there are many European countries not currently offering PhD programmes in EE (Wint et al. 2023), there has been a tendency for would-be candidates to move country to pursue a PhD in the field.

In addition, a robust procedure for collecting and comparing data can serve as a baseline for the collection of longitudinal data to measure progress in the evolution of EE publication output at country level.

Methodology

As the only previous attempt to provide data on engineering education publishing output at country level (Lima and Mesquita 2018) had the limitations described earlier, the authors set out to provide more granular and detailed data that could serve both as a resource and as a baseline for future research.

The authors adopted a pragmatic world-view (Creswell and Creswell 2017, 27) in approaching this investigation and opted for a quantitative method (Creswell and Creswell 2017, 106).

Towne and Shavelson (2002) identify three categories of research question for scientific research in education:

  • Description – What is happening?

  • Cause – Is there a systematic effect?

  • Process or mechanism – Why or how is it happening?

Given that this type of country-based comparative scientometric analysis is relatively new in EE, such studies can be expected to fall within Towne and Shavelson’s first category. We set out to address three research questions:
  1. What can we learn by comparing the EE output published by authors in the EU + UK countries with the same countries’ output in scientific and engineering (S & E) technical fields? We hypothesised that the patterns would be broadly similar for most European countries.

  2. For Europe-based authors publishing research in education and in non-educational fields, what can we learn by comparing the respective outputs during their entire publication careers?

  3. What insights for policy makers, EE researchers and, in particular, future doctoral candidates does analysis of the h-index of Europe-based authors who publish research in education and in non-educational fields provide?

The data responding to questions two and three can be important in recruitment and promotion processes, where EE scholars may be in competition with colleagues who publish predominantly technical research.

Procedure

While there are many bibliometric analysis techniques or methods (Donthu et al. 2021; Zupic and Čater 2015), this study focuses on specific indicators that are widely used in bibliometric research: number of publications (Haustein and Larivière 2015), citation analysis (Haustein and Larivière 2015), distribution of publications by type (Hall 2011), and h-index (Donthu et al. 2021).

This study also introduces a new bibliometric indicator that has not previously been reported in the literature: the distribution of educational and non-educational publications. This indicator was necessary to answer research question three and, to the best of the authors’ knowledge, no study in the existing literature has previously attempted to determine the distribution of educational and non-educational research publications for a set of authors; this is therefore the first study to present this style of analysis.

Figure 1 shows the outline of how data was gathered and analysed in this study. The steps were grouped into five main stages: country selection, selecting the data source, data selection and processing, data collection, and classification.

Figure 1. Outline of how data was gathered and analysed.

The remainder of this section discusses each of these stages in detail. The headings of each subsection match those in Figure 1 for clarity of the procedure.

In summary, we opted to identify the EU and UK affiliated authors publishing in 13 EE journals in the 2018–2019 period, and then to analyse their entire publication careers in both educational and non-educational fields to see how this influenced their publication records.

The two-year period 2018–2019 was selected because it predates the COVID-19 pandemic and provides sufficient numbers of authors for meaningful analysis. The period 2019–2020 was not used because the pandemic could have notably influenced publication patterns during 2020.

Identify target countries

Altogether, 28 countries were considered for analysis: the 27 EU countries and the UK, which was an EU member until January 2020 and thus part of the union during the 2018–2019 period covered by our data. Only countries with at least ten affiliated authors published in the period 2018–2019 were included in the final synthesis; the initial list of 28 countries was thus reduced to 17, as explained in the relevant section below.

It is important to note that the number of authors from each country may change from year to year owing to the differing number of publications and authors each year; this is a factor that cannot be controlled. The data presented in this study may therefore vary if the analysis were conducted on authors who published in years other than 2018–2019, which is a limitation of the study. For context, Appendix 4 shows the number of authors for 2018, 2019 and the neighbouring years (2017, 2020) to indicate how stable the conclusions are. As shown, the number of authors from each country can vary on a year-by-year basis. Note that the numbers in Appendix 4 are based on the authors’ location at the time of publication, whereas the other data in this study are based on the authors’ location in 2021, as explained in the methodology; this explains why the numbers in Appendix 4 differ from those elsewhere in the study.

Select research database

Scopus was selected as the sole research database because it is a large multi-disciplinary research database, which was needed to ensure that authors’ publications from across a wide diversity of research disciplines were likely to be recorded. Scopus was also selected over alternatives such as Web of Science because it has indexed more publications, and some prominent engineering education journals were not indexed by other research databases (e.g. the Australasian Journal of Engineering Education was not indexed by Web of Science at the time of writing). It was also not suitable to combine Scopus with another research database, because the citation count of a paper can vary slightly between databases, which would pose a problem for citation analysis if records from multiple databases were combined.

Data were gathered from the Scopus API (http://api.elsevier.com and http://www.scopus.com) in 2021 using the pybliometrics Python library (Rose and John 2019). Collection took several months due to limitations of the Scopus API, which only permits downloading data for about 10,000 articles per week.

Identify engineering education journals that published articles in 2018–2019

A comprehensive list of EE publications from each of the respective countries was required. To create this list, thirteen research journals relevant to the field of engineering education were consulted (Table 1). The journals were identified by conducting a search on Scopus. The initial criterion for selection was that the journal title indicated an EE focus through inclusion of ‘engineering’ along with ‘education’ or ‘pedagogy’. This identified a list of 22 possible journals. Several journals were then excluded based on inductively created exclusion criteria.

Table 1. EE journals included in the study.

Criteria for excluding a journal were that:

1. The journal must be currently still active (at the time of analysis). This excluded:

  1. Chemical Engineering Education (inactive since 2017)

  2. Engineering Education (inactive since 2014)

  3. International Journal of Applied Engineering Education (inactive since 1992)

2. The journal must focus solely on engineering education, and not on other disciplines. This excluded:

  1. World Transactions on Engineering and Technology Education

  2. Engineering Science and Education Journal

  3. International Journal of Cognitive Research in Science, Engineering and Education

  4. Journal of Technical Education and Training

3. The journal must have engineering education as its primary focus, rather than technology in education or software applications. This excluded:

  1. Computer Applications in Engineering Education

4. The journal must focus on engineering education at the tertiary or university level. This excluded:

  1. International Journal of Continuing Engineering Education and Life Long Learning

  2. Journal of Pre-College Engineering Education Research.

Identify all articles in these journals published 2018–2019

For each journal, a list of all the publications it published in 2018–2019 (inclusive) was retrieved. The purpose of this was to gather a comprehensive set of EE publications from which to gather information about EE authors. Publication types such as editorials, letters, errata or notes were excluded.
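As an illustration of this retrieval step, the following is a minimal sketch using the pybliometrics library mentioned above. It is not the authors’ published code, and the ISSN shown (for the European Journal of Engineering Education) and the exact query syntax are assumptions.

```python
from pybliometrics.scopus import ScopusSearch

# Retrieve everything one journal published in 2018-2019 (inclusive).
# Assumes a configured Scopus API key; the ISSN and query are illustrative.
query = 'ISSN(0304-3797) AND PUBYEAR > 2017 AND PUBYEAR < 2020'
search = ScopusSearch(query)

# Drop editorials ('ed'), letters ('le'), errata ('er') and notes ('no'),
# mirroring the exclusions described above.
EXCLUDED_SUBTYPES = {'ed', 'le', 'er', 'no'}
documents = [d for d in (search.results or []) if d.subtype not in EXCLUDED_SUBTYPES]

print(f'{len(documents)} publications retrieved for 2018-2019')
```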

Retrieve list of all authors from each of the selected countries

For each country, the list of all authors who had published at least one article between 2018 and 2019 (inclusive) was considered. The affiliation information in Scopus was used to establish which country (or countries) each author was affiliated with; author affiliations were based on Scopus information for September 2021. If an author was affiliated with multiple countries, they were listed for each country. Thus we have 895 unique authors but 901 author entries overall, allowing for 6 authors with dual affiliation (see details in Appendix 2).
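A minimal sketch of how an author’s country (or countries) could be read from Scopus with pybliometrics is shown below; the attribute names (in particular affiliation_current and its country field) are assumptions about the library’s author record and may differ between versions.

```python
from pybliometrics.scopus import AuthorRetrieval

def author_countries(author_id):
    """Return the set of countries the author is currently affiliated with,
    according to Scopus at the time the query is run (here, September 2021)."""
    author = AuthorRetrieval(author_id)
    affiliations = author.affiliation_current or []  # assumed attribute name
    return {aff.country for aff in affiliations if getattr(aff, 'country', None)}

# An author affiliated with institutions in two countries is counted once per
# country, which is how 895 unique authors yield 901 country-level entries.
```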

Exclude countries with less than 10 authors

Eleven countries with fewer than ten authors (Hungary, Lithuania, Malta, Cyprus, Czech Republic, Croatia, Bulgaria, Estonia, Latvia, Luxembourg, and Romania), including those with no authors, were excluded to aid readability of the data. This meant that seventeen countries were included in the final synthesis.

Retrieve full publication history of each author from research database

Comprehensive details for each author were then retrieved from Scopus, including their full publication history in all fields. For subsequent analysis, only articles, conference papers, reviews, books, and book chapters were included; other publication types such as editorials, letters, errata or notes were excluded. Key details of each publication were captured, including but not limited to the document title (Scopus field: ‘title’), publication name (e.g. JEE) (Scopus field: ‘publicationName’), document publication year (Scopus field: ‘coverDate’), document type (e.g. article) (Scopus field: ‘subtype’), author keywords (Scopus field: ‘authkeywords’), subject areas (Scopus field: ‘subject_areas’), citation count (note that this can change over time; this is a limitation of the study), and DOI. A total of 39,322 publications up to the end of 2020 were captured for the authors (see Appendix 2).
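The sketch below illustrates how such a per-author download could be scripted with pybliometrics; the AU-ID query and the result field names are assumptions based on the Scopus fields listed above, not the authors’ exact code.

```python
from pybliometrics.scopus import ScopusSearch

# article, conference paper, review, book, book chapter
INCLUDED_TYPES = {'ar', 'cp', 're', 'bk', 'ch'}

def publication_history(author_id):
    """Return the key fields of every included publication by one author."""
    results = ScopusSearch(f'AU-ID({author_id})').results or []
    history = []
    for doc in results:
        if doc.subtype not in INCLUDED_TYPES:
            continue  # skip editorials, letters, errata, notes, etc.
        history.append({
            'title': doc.title,
            'publicationName': doc.publicationName,
            'year': (doc.coverDate or '')[:4],
            'subtype': doc.subtype,
            'authkeywords': doc.authkeywords,
            'citations': int(doc.citedby_count or 0),
            'doi': doc.doi,
        })
    return history
```

Journal-level subject areas (‘subject_areas’) would typically require a separate per-document or per-source lookup, which is omitted here for brevity.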

Classify papers as educational or non-educational

Publications were subsequently classified as being either educationally focused or non-educationally focused. The purpose of this was to build an understanding of how educational and non-educational publications contribute to the research track record of each author. Because this involved analysis of thousands of publication records, it was not feasible to do this manually. Therefore, a computer-aided approach was required to automate the process: an algorithm was created using a combination of keyword search and Scopus data fields.

An extensive manual scoping search involving several iterations, coupled with subsequent testing, was undertaken to identify suitable combinations of Scopus data fields and keywords (this is similar to how a search string is constructed during the scoping stage of a systematic literature review).

A publication was deemed to be educationally focused if:

1. any of the following three Scopus data fields included any of the terms ‘education’, ‘student’, ‘teach’, ‘tutor’, ‘novice’, ‘MOOC’, ‘ASEE’, ‘SEFI’:

   ○ ‘authkeywords’, i.e. author keywords – the custom terms which authors can use to describe the contents of their publication

   ○ ‘subject_areas’, i.e. the subject areas in the Scopus database used to classify the journal, such as ‘Biotechnology’, ‘Education’ or ‘Computer Science (miscellaneous)’. Scopus explains that for subject areas, ‘classification is done by in-house experts at the moment the serial title is set up for Scopus coverage; the classification is based on the aims and scope of the title, and on the content it publishes.’ (Scopus 2022)

   ○ ‘publicationName’, i.e. the name of the journal, such as ‘Journal of Engineering Education’, ‘Chemistry Education Research and Practice’ or ‘Journal of Environmental Management’

OR

2. the Scopus ‘title’ data field, i.e. the author-defined title of the research publication (not the title of the journal), included the term ‘learn’, AND the term ‘learn’ appeared at least once outside the term ‘machine learn’.

The inclusion of criterion 2 was necessary because ‘learn’ was identified as a term that was essential for some papers to be correctly flagged as educational (i.e. there were no other terms which would have worked). However, an issue arose where papers on ‘machine learning’ were then often flagged as educational when they were not (this is also why ‘learn’ was restricted to the ‘title’ field). To address this issue, it was required that ‘learn’ appeared at least once in the title outside the context of the term ‘machine learn’. This increased the accuracy, but some machine learning publications were still incorrectly flagged as being educationally focused.
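The following is a minimal reconstruction of this classification rule from the description above (not the authors’ published code); the function signature and field handling are assumptions.

```python
EDU_TERMS = ('education', 'student', 'teach', 'tutor', 'novice', 'mooc', 'asee', 'sefi')

def is_educational(title, authkeywords, subject_areas, publication_name):
    """Flag a publication as educationally focused per criteria 1 and 2 above."""
    # Criterion 1: any educational term in the author keywords, subject areas
    # or journal name.
    combined = ' '.join(filter(None, [authkeywords, subject_areas, publication_name])).lower()
    if any(term in combined for term in EDU_TERMS):
        return True
    # Criterion 2: 'learn' appears in the document title at least once outside
    # the phrase 'machine learn'.
    t = (title or '').lower()
    return t.count('learn') > t.count('machine learn')

# For example, 'Machine learning to support learning analytics dashboards'
# satisfies criterion 2 because 'learn' also occurs outside 'machine learn',
# whereas 'Machine learning for fault detection' does not.
```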

To test the efficacy and accuracy of this algorithm (compared to human judgement), a random subset of 670 publications was manually coded by the authors as being either educationally focused or non-educationally focused. This was then compared to the output of the algorithm:

  • 400 papers from the Portugal and Spain authors were checked; there was 99.7% agreement between human judgement and the algorithm.

  • 270 papers from the Denmark, Finland and Sweden authors were checked; there was 97.4% agreement between human judgement and the algorithm.

There was an overall 98.8% agreement between the authors and the algorithm (6 false positives and 2 false negatives). This was deemed to be acceptable accuracy for analysing the larger dataset and drawing conclusions (with the acknowledged limitation that about 1.2% of publications may be incorrectly flagged).
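As a simple sketch, the agreement figures above can be reproduced from two parallel lists of labels (manual and algorithmic); the function below is illustrative rather than the authors’ code.

```python
def validate(manual, algorithm):
    """manual and algorithm are equal-length lists of booleans (True = educational)."""
    agreement = sum(m == a for m, a in zip(manual, algorithm)) / len(manual)
    false_positives = sum((not m) and a for m, a in zip(manual, algorithm))
    false_negatives = sum(m and (not a) for m, a in zip(manual, algorithm))
    return agreement, false_positives, false_negatives

# With 670 checked papers, 6 false positives and 2 false negatives correspond
# to an overall agreement of (670 - 8) / 670, i.e. about 98.8%.
```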

Synthesise information about authors’ publications

Information for each of the 895 authors was then established, including:

  • the number of years the author had been publishing, and how long they had been publishing educational papers

  • the distribution of the publications by document type including articles, conference papers, book chapters, books, and reviews.

  • the percentage of publications which were educationally focussed

  • the number of citations on educational and non-educational publications

  • the author’s overall h-index, and the h-index of their educational and non-educational publications considered separately
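The h-index values were computed separately over each author’s educational and non-educational publications. A minimal sketch of the standard h-index calculation is shown below; the record structure used in the usage comment (history, ‘educational’, ‘citations’) is illustrative only.

```python
def h_index(citation_counts):
    """Largest h such that h publications have at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(counts, start=1):
        if citations >= rank:
            h = rank
        else:
            break
    return h

# Illustrative usage on one author's record, split by the educational flag:
# edu_h    = h_index([p['citations'] for p in history if p['educational']])
# nonedu_h = h_index([p['citations'] for p in history if not p['educational']])
```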

Results

Distribution of authors by country

We began by identifying the country affiliation of the authors that published in the 13 EE journals indexed by Scopus that make up our total sample (Table 2).

Table 2. Summary of the number of authors from each country, and the total number of educational and non-educational papers by authors from each country (authors from 13 EER journals).

Table 2 demonstrates that Spain had by far the highest number of authors (381). The United Kingdom was second with 120 authors. Other countries with more than 30 authors included Portugal (56), Greece (44), Germany (42), and Italy (38).

We then applied a value criterion to our search by identifying the authors who published in the three engineering education journals listed by Scimago in May 2022 as being both in the first quartile in the Education category and in the first quartile for one or more technical categories such as Miscellaneous Engineering or Chemical Engineering: the European Journal of Engineering Education, IEEE Transactions on Education, and the Journal of Engineering Education. The purpose of this was to understand whether authors from certain countries were more likely to publish in first quartile journals.

Table 3 shows that the countries in our sample with 10 or more authors who published in the first quartile journals were the United Kingdom, Spain, Portugal, Sweden, Finland, Ireland, and Germany. Several countries had no authors who had published in the first quartile journals (Austria, Hungary, Serbia, Slovakia, Slovenia), while the remainder had fewer than ten such authors.

Table 3. Number of authors from each country who published in Q1 engineering education journals in 2018–2019 (based on Scimago rankings).

Distribution of educational and non-educational publications from a national perspective

Table 2 shows a breakdown of the total number of educational and non-educational publications by authors from each country (i.e. the sum of all educational publications by all authors from a country). As shown, most countries produced more non-educational publications. Exceptions are Ireland, whose authors had considerably more educational than non-educational publications, and Belgium and the Netherlands, which have similar numbers of educational and non-educational publications.

We were interested to understand the characteristics of a ‘typical’ individual author from each country. A possible limitation of the aggregate information is that authors who have had a long research career and published many publications have a much larger impact on the country totals than authors who have only recently started their research careers. To build generalised research profiles representing a ‘typical’ individual author from each country, it was necessary that the profiles not be heavily biased by researchers with long careers. The approach adopted was to give the research track record of each author the same weighting, whether an author had 12 or 100 publications, for example. The generalised research profiles were therefore built by converting the raw numbers associated with each author into ratios of their overall publications, which could then be averaged across all the authors from a given country. The results are shown in Table 4.

To clarify with an example, Table 2 shows that the total number of educational publications by authors from Austria was 239, and the total number of non-educational publications by authors from Austria was 304. Dividing by the number of authors (28) gives the averages shown in Table 4: 8.5 educational publications and 10.9 non-educational publications per author. These were then converted to a ratio (column 4 in Table 4) to enable easier comparison between the countries.
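As a minimal sketch of this calculation, the Austria figures quoted above reproduce the per-author averages and the educational-to-non-educational ratio discussed in the text:

```python
# Worked example using the Austria figures from the text.
edu_total, nonedu_total, n_authors = 239, 304, 28

mean_edu = edu_total / n_authors        # about 8.5 educational publications per author
mean_nonedu = nonedu_total / n_authors  # about 10.9 non-educational publications per author
ratio = mean_edu / mean_nonedu          # about 0.79 educational papers per non-educational paper

print(f'{mean_edu:.1f} vs {mean_nonedu:.1f}, ratio {ratio:.2f}:1')
```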

Table 4. Distribution of publications as educational or non-educational, categorised by country. N represents the number of authors from that country.

Table 4 demonstrates that the publication patterns of authors from different countries in our sample were variable. On average, authors from 16 countries published more non-educational papers than educational papers, and only authors from 1 country (Ireland) tended to publish more educational papers than non-educational papers. Authors from Ireland in our sample published the most educational papers on average (2.78 educational papers per non-educational paper), while authors from Italy published the fewest (0.06 educational papers per non-educational paper). Authors from Austria (ratio of 0.79:1), Belgium (0.70:1), Slovakia (0.77:1) and Greece (0.88:1) had similar publication rates of educational and non-educational papers, on average.

Distribution of educational and non-educational publications by type from a national perspective

We also examined whether the educational publications were in journals or in other publication venues such as conference proceedings or book chapters (Table 5). The relevant Scopus document types were ‘article’, ‘book chapter’, ‘conference paper’, and ‘review’.

‘Review’ is a document type classification used by Scopus which tends to be applied to certain publications such as systematic and narrative literature reviews. However, it is important to note that application of this classification can differ by publication venue (for reasons Scopus does not make clear; this is a limitation). Some venues classify literature reviews as an ‘article’ while others classify them as a ‘review’. Publications of type ‘review’ and ‘article’ were not combined, as it was considered important to see the number of ‘review’ studies separately. As a result, publications classified by Scopus as a ‘review’ were analysed and presented separately from articles.

Table 5 shows the mean composition of individual authors’ educational publications from each country (not the totals for the entire country). This was calculated by determining the total number of relevant publications of each type (e.g. articles) by authors in the respective country, then dividing by the number of authors.

Table 5 also shows that journal articles and conference papers are widely published by authors from all countries. In comparison to the overall number of journal articles and conference papers, the number of book chapters published by the authors is very limited, in both educational and non-educational fields.

Table 5. Distribution of publications by type, categorised by country. N represents the number of authors from that country.

Authors from Finland, Greece, Belgium, Ireland and Sweden have the highest number of educational-focused articles on average with 8.2, 7.3, 6.3, 6.1 and 6.0, respectively. Authors from Austria have the lowest number of educational-focused journal articles on average with 1.8.

Authors from Ireland, Finland, Belgium and Sweden have the highest number of educational-focused conference papers on average with 12.9, 11.7, 9.2, and 8.2, respectively. Authors from Italy have the lowest number of educational-focused conference papers on average with 0.7.

Authors from ten countries (Austria, Belgium, Denmark, Finland, Germany, Ireland, Netherlands, Portugal, Spain, Sweden) tend to publish more educational-focused conference papers than articles. In contrast, authors from Italy, Poland, Slovenia, and the United Kingdom tend to publish more educational-focused articles than conference papers. The remaining countries (France, Greece, Slovakia) publish educational-focused articles and conference papers at similar rates. This variation suggests that researchers from different countries may take very different approaches to publishing educational papers.

h-index of authors from each country

Finally, we calculated how the educational and non-educational publications affected the authors’ overall h-index values (Table 6).

Table 6. Comparison of the h-index of all publications, only educational publications, and only non-educational publications, based on mean value of individual authors from each country. N represents the number of authors from that country.

The non-educational publications h-index was notably higher for the majority of countries. The only countries where the educational publications h-index was higher than the non-educational publications h-index were Greece and Ireland. The h-index of educational and non-educational publications was broadly similar for Slovakia, Austria and Finland.

Discussion

Overall output

There are a number of sources providing comparative data on the scientific publication output of European countries. Column 3 of Table 7 shows the global scientific output for European countries according to the Nature Index (2019–2020), while column 4 has the output for engineering publications indexed in Scopus for Western European countries (Scimago 2021). Although there are some differences in the way samples and data are compiled and presented by these two sources, we see that overall the place of countries in these tables is relatively consistent. These data can be compared with the ranking of EE publications in our sample.

Table 7. Ranking of EE publications in Q1 journals with output in global scientific and engineering domains.

From these data, we can see two notable phenomena:

  • Spain and the UK stand out in terms of the high number of authors published in EE, although in the former only 10% were in Q1 journals.

  • The output of certain countries deviates from what we might have expected based on their position in global league tables that collate scientific and engineering (GSE) technical research publications in general. Germany, France and Italy have lower EER output compared with their position in GSE tables, while Portugal, Sweden, Finland, Denmark and Ireland are higher (Table 7).

We will now look in more detail at these findings at individual country level and consider explanations for patterns identified.

Spain

The high output from Spain could be explained by the national regulations governing career progression that apply to all academics in public HE institutions (Valentine and Williams 2021a). The Spanish system involves 6-yearly evaluations based on an accumulation of ‘merits’ as defined nationally by the ANECA accreditation agency (2020); these are quantitatively defined targets that favour publication in JCR-indexed journals. This can give rise to a publish-or-perish perception among engineering educators. Furthermore, as the requirements for merit accumulation via journal publication in educational journals are slightly lower than for specialised engineering journals, ambitious faculty may identify education journals as priority outlets for career-based reasons regardless of their commitment to education.

UK

Given the UK’s position in scientific publishing rankings such as those in Table 7, it is not surprising that it appears as a high EER contributor. As in the case of Spain, the high UK output in journals may also be the result of a national publish-or-perish policy: all universities are subject to the Research Excellence Framework (REF), which is used to determine the allocation of ‘quality-related’ government research funding within the UK (Wint and Nyamapfene 2021) and takes place every 6 years. The REF leads to national league tables of HEIs and can have a major impact on the life of an engineering school. Its criteria prioritise publication in high impact journals.

Germany, France and Italy

Given their high GSE rankings (Table 7), it is surprising that these three countries are less well represented in EE authorship. This suggests that EE as a field is not seen as a priority by engineering faculty in these countries. We have not encountered an explanation for these findings, which suggests a need for more comparative research into the contexts of these countries within the European research sphere. For example, it would be of interest to establish how much research is published in national conferences and not in English, as is the case for scholarship of teaching and learning papers in Germany – such studies would not show up in our Scopus analysis.

Nordic countries

We see in Table 7 that these three countries tend to be higher in the EE ranking than their positions in the GSE columns. This trend aligns with the reported growth of a strong Community of Practice in these countries (Edström et al. 2018) following the creation in 2009 of NNEER (Nordic Network for Engineering Education Research) with funding from the research council of the Nordic Ministerial Council (NordForsk). The vitality of EE in this region may also be a reflection of the fact that there are more PhD programmes in EE (5) available in these three countries than in the rest of Europe combined (2), according to the online list maintained by the American Society for Engineering Education's Student Division.

Portugal and Ireland

These two countries appear higher in our sample than their GSE ranking might suggest (Table 7).

In the case of Ireland, although the number of authors in our sample was relatively small (17), 13 out of 17 (76%) of the authors from Ireland had published in Q1 journals during 2018–2019 (Table 3), which suggests there has been a focus on EER there. This aligns with data presented in a recent paper (Wint et al. 2022) suggesting that a number of institutions clearly support PhD research projects in engineering education and that publications in the field count towards promotion.

In the case of Portugal, there have been two previous publications:

By contrast with Spain and the UK, although Portuguese faculty are subject to evaluation processes, these are more loosely defined and allow more flexibility at the level of individual higher education institutions (Valentine and Williams 2021a).

It is interesting to compare our data for 2018–2019 with that in the earlier study (van Hattum-Janssen, Williams, and Nunes de Oliveira 2015) that looked at Portuguese EER publication in the period 2000–2012. That study reported an increase in output in the years 2011 and 2012, when there was an average of 13 publications per annum, although the majority of these were in non-Quartile 1 journals. The authors of that paper concluded that EER in Portugal was evolving positively, and if we compare the earlier data with the output from Portugal in our sample we see that this trend has continued in quantitative terms: annual output has almost doubled, with significantly more publications in Quartile 1 journals.

In the 2015 article, the authors noted that with regard to EE in Portugal there was ‘little structural support and the financial support received for such research has been modest’. They also commented that there were no existing PhD programmes in the area. To the best of the present authors’ knowledge, this situation has not altered in the intervening years, which raises the question: if research in EE receives little structural support or funding in Portugal, why has it been on the increase? One possible contributory factor was the founding in 2011 of SPEE, the Portuguese Society for Engineering Education. This organisation initiated a biennial international conference (CISPEE), organises an in-person forum for engineering educators, liaises with local, national and international decision-makers, and publishes a quarterly newsletter. This activity may have played a part in nurturing EER in the country, but more detailed analysis is called for in future work.

Moving on to the data on the authors’ entire publication careers, we again encounter notable patterns when we compare the countries.

Educational ratio

Table 4 presents the ratio of educational to non-educational publications for the entire careers of the authors in our sample. The reader can see in Appendices 1a and 1b a random sample of 100 exemplar papers drawn from the 39,322 publications in our sample. It will be noted that the non-educational papers are typically technical engineering papers.

In general, the authors in our sample produced more non-educational articles than educational ones over their careers (Table 4), Ireland being the only exception. This is particularly noticeable for countries like Spain, the UK, France, Italy and Germany, all of which are in the upper region of the publication ranking tables for global science and for engineering output in Table 7.

Articles versus conference papers

Table 5 shows that there is quite a range in the priority assigned to conference papers: whereas authors from the majority of the countries in our sample published more in journals, it is notable that authors in five countries (Austria, Germany, Ireland, the Netherlands and Sweden) published more conference papers than journal articles. This could be a reflection of the academic evaluation processes in the different countries, with journal articles given a high priority in countries like Spain and the UK, while less rigid evaluation procedures in other countries can serve to encourage educators’ participation in conferences for their value in sharing ideas, learning and networking.

The trends noted above for educational publications are also apparent, broadly speaking, in the case of non-educational publications.

h-index values

In Table 6 we see that there is a general trend for non-educational publications to influence h-index values more than educational ones, Greece, Ireland and Slovakia being the only countries where educational publications have a strong influence.

This reflects a generalised phenomenon noted in the 1970s by citation analysis pioneer Eugene Garfield – founder of the ISI system and credited with initiating the journal impact factor concept – when he observed that ‘citation potential can vary significantly from one field to another’ (Garfield 1979). In general, engineering education articles tend to have much lower citation rates than those in specialised engineering fields. This can be seen in journal impact factors: for example, the most cited journal in the field of EE, the Journal of Engineering Education, has a 2020 impact factor (IF) of 3.146; the highest across all education fields is the Review of Educational Research, with an IF of 13.5; while the three highest-ranked journals in the field of Mechanical Engineering are Nature Materials (43.84), Materials Science and Engineering: R: Reports (36.21) and Advanced Materials (30.85) (Valentine and Williams 2021c).

Overall patterns

The authors’ experience in sharing these comparative data on publication patterns during workshops and meetings in Europe has been that engineering educators and researchers are eager to get a picture of how their own national context compares with that of other countries, and that there are relatively few sources of empirical data available.

We believe that the data and analysis presented here will help clarify the current state of the art in the countries featured in our sample. The main takeaways are that there is a considerable range in publication patterns across the EU + UK, with Spain and the UK leading with regard to the quantity of journal publications. At the same time, there are countries, such as Germany, France and Italy, that feature highly in the ranking of engineering and scientific publications but appear further down the ranking of engineering education output.

The importance given to education publications versus non-educational ones also varies considerably. The trend for non-education publications (typically technical ones) to contribute more to the h-index value of an author will also be an important consideration for some researchers.

Limitations

The most significant limitation of the present study is that our sample of authors was gathered from those who published in (at least) one of thirteen EER journals over a two-year window between 2018 and 2019 (inclusive). This was due to the time required to download information about the 39,322 papers in our sample from Scopus. Future iterations of the procedure will allow us to expand our sample to include broader time intervals.

A second limitation arises from the fact that, for technical reasons of the Scopus API, author affiliations are assigned based on the countries Scopus reported they were affiliated with on 29 September 2021, so authors who moved country between 2018 and 2021 could be incorrectly assigned. However, we believe inter-country mobility is sufficiently low that the majority of the 895 authors in our sample are correctly attributed.

Machine analysis of text data is always open to a level of error but, as shown earlier, manually checking 670 papers showed a high level of accuracy for the procedure adopted. We see qualitative data collection on regulations and policy at national level as an important complement to the quantitative approach adopted here.

A different type of limitation arises when attempting to propose hypotheses regarding the link between our findings and the EER context of the countries studied. As much of the contextual information at national level is not available in English we looked to existing research publications to provide such information. As we explain in the literature review, there is little published material available to provide background information on regulations and policy in each of the EU countries with respect to EER.

Future research

The authors have identified and initiated two future lines of investigation.

  • recognising that there is a need for more contextual information, we aim to collect data on the EER context of the countries included in the present study by sharing the current data via international events such as conferences and webinars and inviting contributions from colleagues in the various countries. This can involve qualitative approaches such as interviews and focus groups.

  • in the longer term, we intend to use the data presented here as a baseline to enable longitudinal studies of EER evolution at country level based upon the scientometric procedures presented in the present paper.

Conclusions

This paper presents a scientometric procedure that allows comparison of engineering education publication output in different countries. The procedure is implemented to compare European countries by taking a snapshot of researchers published in EER journals over a two-year period – 895 in all. This allowed us to compare the publication output from each country. We then analysed the entire publication careers of these authors to see how many of their publications were educational and how this affected their h-index values.

Looking first at the output in the period 2018–2019, we note wide disparities between the various European countries and, while we propose possible explanations for some of these, findings for other countries, and Germany, France and Italy in particular, await analysis in a future study.

Countries also varied widely in the ratio of educational to non-educational publications. In addition, our findings suggest that non-educational publications had a greater impact on authors’ h-index values than their educational output and this is in keeping with literature relating to other fields of inquiry.

We believe this approach can be valuable in future studies: it can have a role in longitudinally mapping and comparing engineering education publication in and across Europe, and it can provide a valuable resource for policy-makers and researchers in general, and future doctoral candidates in particular, by facilitating an evidence-based approach to mapping what is being done and what may need to be done to increase the quality of engineering education across the EU + UK region.

Acknowledgement

This work is partially financed by Portuguese funds through the FCT - Foundation for Science and Technology, I.P., under the project UIDB/00097/2020.

Disclosure statement

No potential conflict of interest was reported by the author(s). The second author was one of the 895 authors in the data set studied.

Additional information

Funding

This work is partially financed by Portuguese funds through the FCT – Foundation for Science and Technology, I.P., [project UIDB/00097/2020].

Notes on contributors

Andrew Valentine

Andrew Valentine is a Teaching Fellow in the School of Computing and Information Systems at the University of Melbourne, Australia. He initially completed a dual degree in electronics engineering and computer science at RMIT University, where he subsequently completed a PhD in engineering education. He was a postdoctoral research fellow at the University of Western Australia, and then a lecturer at the University of Queensland before taking up his current position. His primary research interests are fostering creativity skills in engineering, novel educational technologies and technology enhanced learning, scientometrics, development of curricula, and building students’ employability skills.

Bill Williams

Bill Williams originally trained as a chemist at UCC, Ireland and went on to work in education in Ireland, UK, Eritrea, Kenya, Mozambique and Portugal. He serves as an associate editor of the European Journal of Engineering Education and senior associate editor for the Journal of Engineering Education. He is a researcher at the Centre for Management Studies (CEGIST) of Instituto Superior Técnico, University of Lisbon, is Professor Emeritus of Setúbal Polytechnic Institute, Portugal and Adjunct Senior Research Fellow at TUDublin, Ireland. He is a founder member of the Portuguese Society for Engineering Education (SPEE).


References

  • Akin, A., and C. O. Güzeller. 2022. “The 500 Most-Influential Articles in Mathematics Education Research for the Period 1970–2020: A Bibliometric Citation Analysis.” Journal of History School 15 (59): 2241–2270. https://doi.org/10.29228/joh.58195.
  • ANECA. 2020. Criterios Academia 2020b_Ingeniería y Arquitectura_02. http://www.aneca.es/content/download/15233/187703/file/Criterios%20Academia%202020b_Ingenier%C3%ADa%20y%20Arquitectura_02.pdf.
  • Apiola, M., M. Saqr, S. López-Pernas, and M. Tedre. 2022. “Computing Education Research Compiled: Keyword Trends, Building Blocks, Creators, and Dissemination.” IEEE Access 10:27041–27068. https://doi.org/10.1109/ACCESS.2022.3157609.
  • Borrego, M., and J. Bernhard. 2011. “The Emergence of Engineering Education Research as an Internationally Connected Field of Inquiry.” Journal of Engineering Education 100 (1): 14–47. https://doi.org/10.1002/j.2168-9830.2011.tb00003.x.
  • Borrego, M., and B. Olds. 2011. “Analysis of Trends in United States National Science Foundation Funding for Engineering Education: 1990–2010.” In Research in Engineering Education Symposium, 175–183. Madrid, Spain.
  • Creswell, J. W., and J. D. Creswell. 2017. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches. Sage Publications.
  • Dart, S., and S. Cunningham-Nelson. 2020. “Seventeen Years of Australasian Engineering Education Research: Trends from Textual Analysis.” In Proceedings of the 31st Annual Conference of the Australasian Association for Engineering Education (AAEE 2020). Sydney, Australia.
  • Donthu, N., S. Kumar, D. Mukherjee, N. Pandey, and W. M. Lim. 2021. “How to Conduct a Bibliometric Analysis: An Overview and Guidelines.” Journal of Business Research 133:285–296. https://doi.org/10.1016/j.jbusres.2021.04.070.
  • Edström, K., A. Kolmos, L. Malmi, J. Bernhard, and P. Andersson. 2018. “A Bottom-Up Strategy for Establishment of EER in Three Nordic Countries – the Role of Networks.” European Journal of Engineering Education 43 (2): 219–234. https://doi.org/10.1080/03043797.2016.1190956.
  • Fensham, P. J. 2004. Defining and Identity: The Evolution of Science Education as a Field of Research. New York: Springer.
  • Froyd, J. E., and J. R. Lohmann. 2014. “Chronological and Ontological Development of Engineering Education as a Field of Scientific Inquiry.” In Cambridge Handbook of Engineering Education Research, edited by Aditya Johri and Barbara M. Olds, 3–26. New York: Cambridge University Press. http://doi.org/10.1017/CBO9781139013451.003.
  • Garfield, E. 1979. “Is Citation Analysis a Legitimate Evaluation Tool?” Scientometrics 1 (4): 359–375. https://doi.org/10.1007/BF02019306.
  • Hall, M. 2011. “Publish and Perish? Bibliometric Analysis, Journal Ranking and the Assessment of Research Quality in Tourism.” Tourism Management 32 (1): 16–27.
  • Haustein, S., and V. Larivière. 2015. “The Use of Bibliometrics for Assessing Research: Possibilities, Limitations and Adverse Effects.” In Incentives and Performance, edited by I. M. Welpe, J. W. S. Ringelhan, and M. Osterloh, 121–139. Cham: Springer.
  • Klassen, M., and J. M. Case. 2022. “Productive Tensions? Analyzing the Arguments Made About the Field of Engineering Education Research.” Journal of Engineering Education 111 (1): 214–231. https://doi.org/10.1002/jee.20440.
  • Lima, Rui M., and Diana Mesquita. 2018. “Engineering Education (Research) in European Countries – an Overview Based on Publications in Journals.” In 2018 3rd International Conference of the Portuguese Society for Engineering Education (CISPEE), 1–6. Aveiro, Portugal: IEEE.
  • López-Pernas, S., M. Saqr, and M. Apiola. 2023. “Scientometrics: A Concise Introduction and a Detailed Methodology for Mapping the Scientific Field of Computing Education Research.” In Past, Present and Future of Computing Education Research: A Global Perspective, edited by M. Apiola, S. López-Pernas, and M. Saqr, 79–99. Cham: Springer.
  • Nature Index. 2019–2020. Accessed January 27, 2022, https://www.natureindex.com/country-outputs/generate/all/Europe.
  • Neto, P., and B. Williams. 2017. “The European Journal of Engineering Education as a Venue for Engineering Education Research Publication: A Meta View.” In 45th SEFI Conference, Azores, Portugal.
  • Nyamapfene, A., and B. Williams. 2017. “Evolution of Engineering Education Research as a Field of Inquiry in the UK.” In 7th Research in Engineering Education Symposium (REES 2017): Research in Engineering Education, Bogota, Colombia.
  • Rose, M. E., and R. K. John. 2019. “Pybliometrics: Scriptable Bibliometrics Using a Python Interface to Scopus.” SoftwareX 10:100263. https://doi.org/10.1016/j.softx.2019.100263.
  • Scimago. 2021. Country Rank for 2020. Accessed January 15, 2022. https://www.scimagojr.com/countryrank.php?area=2200&region=Western%20Europe&order=itp&ord=desc&year=2020.
  • Scopus. 2022. What are the Most Used Subject Area Categories and Classifications in Scopus? Accessed June 28, 2023. https://service.elsevier.com/app/answers/detail/a_id/14882/supporthub/scopus/~/what-are-the-most-frequent-subject-area-categories-and-classifications-used-in/.
  • Towne, L., and R. J. Shavelson. 2002. Scientific Research in Education. National Academy Press Publications Sales Office.
  • Valentine, A. 2020. “Do Australian Engineering Education Researchers Publish more Educational or Non-Educational Research? A Bibliometric Analysis.” In 31st Annual Conference of the Australasian Association for Engineering Education (AAEE 2020): Disrupting Business as Usual in Engineering Education: Disrupting Business as Usual in Engineering Education, 471–478. Barton, ACT: Engineers Australia.
  • Valentine, A., and B. Williams. 2021a. “Evolution of Engineering Education Research in Portugal and Spain: A Scientometric Study.” In 2021 4th International Conference of the Portuguese Society for Engineering Education (CISPEE). Lisbon, Portugal: IEEE.
  • Valentine, A., and B. Williams. 2021b. “Engineering Education Research in the Nordic Countries: Scientometric Insights into Publication and Career Patterns.” In Proceedings of the 49th Annual Conference of the European Society for Engineering Education (SEFI). SEFI: Berlin, Germany.
  • Valentine, A., and B. Williams. 2021c. “Engineering Education and Non-Education Research: A Scientometric Comparison of 7 Countries.” In REES AAEE 2021 Conference: Engineering Education Research Capability Development: Engineering Education Research Capability Development, 755–764. Perth, WA: Engineers Australia.
  • van Hattum-Janssen, N., B. Williams, and J. M. Nunes de Oliveira. 2015. “Engineering Education Research in Portugal, an Emerging Field.” International Journal of Engineering Education 31 (2): 674–684.
  • Wang, S., Y. Chen, X. Lv, and J. Xu. 2023. “Hot Topics and Frontier Evolution of Science Education Research: A Bibliometric Mapping from 2001 to 2020.” Science & Education 32 (3): 845–869. https://doi.org/10.1007/s11191-022-00337-z.
  • Williams, B., and M. Alias. 2011. “Strategic Pathways to Engineering Education Research: Case Study of a Top-Down Initiative.” In Proceedings of the Research in Engineering Education Symposium (REES 2011). Madrid, Spain.
  • Williams, B., P. C. Wankat, and P. Neto. 2018. “Not so Global: A Bibliometric Look at Engineering Education Research.” European Journal of Engineering Education 43 (2): 190–200. https://doi.org/10.1080/03043797.2016.1153043.
  • Wint, N., M. Murphy, A. Valentine, and B. Williams. 2022. “Mapping the Engineering Education Research Landscape in Ireland and the UK.” In 50th Annual Conference of the European Society for Engineering Education, edited by Hannu-Matti Järvinen, Santiago Silvestre, Ariadna Llorens, and Balàzs Nagy, 862–871. Barcelona: Universitat Politècnica de Catalunya.
  • Wint, N., and A. Nyamapfene. 2021. “Perspectives on Engineering Education Research in the UK: What Is Being Done, Why, and for Whom?” In Research in Engineering Education Network & Australasian Association for Engineering Education Conference. Berlin, Germany.
  • Wint, N., B. Williams, A. Valentine, and M. Murphy. 2023. “Mapping The Engineering Education Research Landscape Across Europe.” In Proceedings of the 51st Annual Conference of the European Society for Engineering Education (SEFI). Dublin, Ireland: SEFI.
  • Zupic, I., and T. Čater. 2015. “Bibliometric Methods in Management and Organization.” Organizational Research Methods 18 (3): 429–472. https://doi.org/10.1177/1094428114562629.

Appendices

Appendix 2

Overall number of EE authors and publications included in the study

Appendix 3

Number of publications from each country published in Q1 engineering education journals in 2018–2019 (based on Scimago rankings).

Appendix 4

Number of total authors from each country who published in the 13 engineering education journals in 2017–2020 (based on the authors’ location at the time of publication; duplicate authors in each year not removed).