Research Article

A taxonomy of common engineering activities and competencies

Pages 181-193 | Received 14 Jul 2021, Accepted 23 Apr 2023, Published online: 27 May 2023

ABSTRACT

In this paper, we address the lack of a unified approach to understanding engineering practice by developing and presenting a taxonomy of common engineering activities. The taxonomy consists of 86 common engineering activities linked to 17 engineering competencies and the 11 International Engineering Alliance graduate attributes. The list of activities was developed using a six-step process, including multiple systematic literature searches and surveying engineers. The taxonomy provides a critical foundation for better understanding what engineers do, particularly in the Australian and New Zealand context. The taxonomy has potential utility in both engineering practice research and engineering education curriculum reform.

1. Introduction

Engineering practice research focusses on understanding what engineers do and their work contexts (Sheppard et al. Citation2009; Stevens, Johri, and O’Connor Citation2014). The work that engineers do constantly changes, shaped by socio-economic and technological factors (Frey and Osborne Citation2013). There is ongoing interest in understanding these changes, with a focus on ensuring that engineering curriculum adapts in response (Australian Council of Engineering Deans Citation2019). Despite this interest, the empirical evidence on engineering practice, including how practice is changing, is sparse (Mazzurco et al. Citation2021; Trevelyan Citation2019). This lack of empirical evidence on changes to engineering practice limits the ability for engineering educators to develop responsive curriculum.

Cross-sectional studies (e.g. (Pons Citation2015)) provide insight into differences in engineering practice at a particular time point, for example, the practices of graduates compared with more experienced engineers. In contrast, longitudinal studies allow for more comprehensive and higher-quality evidence for the direction and magnitude of change over time, particularly when participant data are paired. There have been some longitudinal studies on engineering practice (Ashforth, Sluss, and Saks Citation2007; Boxall and Steeneveld Citation1999; Brunhaver et al. Citation2015, Citation2018; Trevelyan and Tilli Citation2008; Western et al. Citation2006). However, most of these studies focussed on the higher education to work transition and terminated within five years of initiation, providing limited insights into long-term changes to engineering practice.

The research reported within this paper is part of a broader longitudinal cohort study; the BeLongEng Project, which uses repeated measures of engineers from Australia and New Zealand to study changes in engineering practice.

The survey used in the BeLongEng Project measures a broad range of career determinants and outcomes (Patton and McMahon Citation2014), including demographics (e.g. gender, age, ethnicity, job characteristics, income), factors that affect engineering practice (e.g. institutional support and standing, work-life factors, changes to technology, self-belief, identity), and engineering activities. In this paper, we describe how we developed a taxonomy of common engineering activities applicable to engineers from different disciplines and experience levels, linked with engineering competencies.

There are a myriad of frameworks which classify the work that engineers do. These include engineering bodies of knowledge (e.g. (American Society of Civil Engineers Citation2019)), and accreditation standards (e.g. (ABET Citation2019; Engineers Australia Citation2019; International Engineering Alliance Citation2021)). These frameworks often classify elements of practice into higher-level outcomes or competencies. The definition of competency is contested (Shippmann et al. Citation2000), but for this study, we adopt a common definition that a competency is the ‘knowledge, skills, abilities, and other characteristics (KSAO’s) that are needed for effective performance in the job in question’ (Campion et al. Citation2011, 226). Competencies represent broad KSAO’s required for job performance and are underpinned by activities (Passow and Passow Citation2017); discrete elements of work that people do. The relationship between competencies and activities is complex; one activity can be associated with multiple competencies, and one competency can be demonstrated through different activities.

Competency frameworks provide preliminary guidance on higher-level outcomes for engineering activities, although these competencies are often focussed on engineers at particular experience levels. Competency frameworks are useful because they simplify the KSAO’s expected of an engineer. However, competency frameworks often lack comprehensive lists of engineering activities applicable to multiple engineering disciplines. This lack of prescription limits the ability to use these competency frameworks to study engineering activities.

In 2017, Passow and Passow synthesised 27 quantitative studies, 25 qualitative studies, and thousands of job postings to identify the competencies needed for graduate engineers. This systematic review included studies of engineering graduates who no longer worked in engineering roles (e.g. (Brunhaver Citation2015)). An outcome of Passow and Passow’s review was a generic list of 16 engineering competencies, mapped to the 2013 Washington Accord (International Engineering Alliance, Citation2014) and ABET competencies, which the authors claim captures the ‘essence of engineering practice’ (Passow and Passow Citation2017, 500). This claim indicates that the competencies might be suitable for integration in a taxonomy of engineering practice.

The study of engineering activities is complicated by a suite of factors, including a lack of a unified framework for engineering practice (Trevelyan Citation2014), the sheer number of different engineering activities (e.g. in bodies of knowledge), complexities in developing a classification system, differences in activities associated with varying work contexts (e.g. disciplines, region, tenure), and in the case of ethnographic research, access to engineering workplaces (Jesiek et al. Citation2020; Stevens and Vinson Citation2016).

Despite these challenges, there remains a significant opportunity to develop a unified taxonomy for studying common engineering activities. The guiding research question for this paper is ‘In order to assess what engineers do in practice, can we identify and classify common engineering activities and link these to broader engineering outcomes?’

In this paper, we describe how we developed an engineering practice taxonomy, comprising a list of common engineering activities paired with competencies and the International Engineering Alliance’s (IEA) eleven graduate defining characteristics (GDC’s) (Version 2021.1, 2021). The IEA GDC’s are the stem for the graduate attributes of the Washington, Sydney and Dublin Accords (WA, SA and DA, respectively) that relate to engineers, engineering technologists, and engineering technicians. We discuss the challenges we encountered in identifying engineering activities, the utility of the taxonomy, limitations, and future work. Our taxonomy provides a unified classification of common engineering activities and competencies, applicable to a broad range of engineering disciplines and experience levels in at least the Australian and New Zealand context. Engineering practice researchers will be able to use this taxonomy as a basis to study what engineers do in practice. Engineering educators can also use the taxonomy to identify activities that engineering students can do in the higher education context to demonstrate expected graduate attributes.

2. Methods

This section describes the methods for developing the list of common engineering activities, pairing of the activities to the engineering competencies and modification of the competencies, and the pairing of the activities to the IEA defining characteristics.

2.1. Developing the list of common engineering activities

We used a six-step process to develop the list of common engineering activities, summarised in Figure 1 (see Results). This process started with a systematic search of the engineering practice literature to identify an initial list of engineering activities (Step 1). We then extended the list using the outcomes of a pilot survey (Step 2). Following this, we revisited the engineering practice literature to identify other lists of engineering activities to add to our list (Step 3). We then reviewed qualitative and mixed-method engineering practice literature to find additional engineering activities (Step 4). Next, we condensed the list by removing duplicate or near-duplicate items (Step 5). Finally, we used another survey to identify any remaining activities (Step 6). The detailed method for each step is provided later in this section. During each step, we applied the following exclusion criteria (E1 to E6) to exclude activities which:

Figure 1. Schematic of results from the six step process used to develop the list of common engineering activities. The relationship with the competency mapping is also shown. A total of 1,206 activities were identified, 86 of which remained after condensing and application of the exclusion criteria (E1 to E6). These 86 common engineering activities form the basis of the taxonomy.

  • E1. Did not include a verb. Example exclusions were ‘economic issues’ and ‘visualisation’ (Robinson Citation2010).

  • E2. Were passive or inactive. Example exclusions for this were ‘ignore …’ (Daka and Fraser Citation2014) and ‘receiving information’ (Robinson Citation2010).

  • E3. Only described thought processes, for example ‘… consider …’ (Brunhaver Citation2015) or ‘taking into account …’ (Trevelyan Citation2008).

  • E4. Described employment-seeking or promotion activities, for example ‘Apply for positions outside organisation, register with employment agencies, maintain current résumé and portfolio of achievements’ (Trevelyan Citation2008).

  • E5. Only described the context of an activity, e.g. volunteering.

  • E6. Duplicated an existing activity.

We excluded thought processes, as these could not be readily partitioned from other activities; thinking happens all the time. We excluded the employment-seeking and promotion activities on the principle that such activities would not be described in an employment position description. When adding activities to the list, we truncated the context, mode, thought element and/or purpose of the activities. For example, ‘Searching for information on the internet, in databases, filing system etc.’ (Williams and Figueiredo Citation2011) was truncated to ‘Searching for information’. We changed these activity descriptions because coupling modes and/or end purposes to individual activities would have made the list too detailed and exhaustive. To prepare the list of activities for use in the surveys (Steps 2 and 6), we changed the language of some activities to a present-participle (gerund) form to improve usability. For example, the activity ‘Analyse business or financial data’ was modified to ‘Analysing business or financial data’.
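As an illustration, the exclusion criteria can be thought of as a series of predicate filters applied to candidate activity phrases. The sketch below is a minimal Python rendering of that idea; the keyword heuristics (`has_verb`, `PASSIVE_MARKERS`) are crude, illustrative stand-ins for the manual judgements described above, not the actual screening procedure:

```python
# Illustrative sketch of exclusion-style screening of candidate activity
# phrases. Only E1 (no verb), E2 (passive/inactive) and E6 (duplicate)
# are approximated here, with assumed keyword heuristics.
PASSIVE_MARKERS = {"ignore", "receiving"}  # assumed E2 markers

def has_verb(phrase: str) -> bool:
    # Crude proxy for E1: treat a leading gerund ("-ing" word) or a
    # small assumed verb list as evidence of a verb.
    first = phrase.split()[0].lower()
    return first.endswith("ing") or first in {"analyse", "design", "test"}

def screen(candidates: list[str], existing: set[str]) -> list[str]:
    kept: list[str] = []
    for phrase in candidates:
        if not has_verb(phrase):                          # E1: no verb
            continue
        if phrase.split()[0].lower() in PASSIVE_MARKERS:  # E2: passive
            continue
        if phrase in existing or phrase in kept:          # E6: duplicate
            continue
        kept.append(phrase)
    return kept

print(screen(["Economic issues", "Receiving information",
              "Analysing business or financial data",
              "Searching for information"],
             existing={"Searching for information"}))
# → ['Analysing business or financial data']
```

In practice each criterion was applied by human judgement rather than keyword matching; the sketch only shows the filter-and-deduplicate structure of the process.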

2.1.1. Step 1 Identify an existing list of engineering activities

As a starting point and accounting for our project scope, we needed to identify an existing list of common engineering activities that was empirical, and that could be applied to any career stage and discipline. To find this list, we used a systematic literature search. We did not conduct a systematic review. Systematic reviews focus on identifying, critically evaluating, and synthesising outcomes from research (Borrego, Foster, and Froyd Citation2014). In contrast, we only needed to identify engineering activities that are reported or mentioned in research. Despite us not completing a systematic review, we aligned our systematic search strategy with accepted systematic review methods, specifically the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework (Page et al. Citation2021). As per the PRISMA framework, we have documented our:

  1. Eligibility criteria

  2. Information sources

  3. Search strategy

  4. Selection process

  5. Data collection process

  6. Data items

We used five search strategies for published literature (P1 to P5) and three search strategies for grey literature (G1 to G3), summarised in Table 1. Grey literature has no agreed definition, but typically refers to documents that are not indexed by academic bibliographic databases. Examples of grey literature include government or organisation information, magazines, blogs, consultancy reports and other websites (Mahood, Van Eerd, and Irvin Citation2014).

Table 1. Summary of search strategies to identify lists of activities that engineers do.

Search strategy P1 used the reference list of Mazzurco et al. (Citation2021), which was developed using screening criteria applied to a database search, a reference list search, and a review of other systematic reviews and bibliographies that are often referenced by engineering practice researchers (e.g. Brunhaver et al. (Citation2018)). Further details of these strategies and criteria are reported by Mazzurco et al. (Citation2021). Mazzurco et al.’s study was limited in its date range, the types of publications included, and the omission of an author-search strategy. We addressed these limitations with additional searches, P2 to P4. We included a citation search, P5, to identify additional studies not found by P1 to P4. We used Scopus for P2 to P5, due to its extensive indexing of titles and abstracts across multiple disciplines. Scopus uses lemmatisation (grouping of similar words into a single item) and equivalency to account for plurals, adjectival variants, and similar word forms, allowing for simpler search strings. For example, the search term ‘engineer’ in Scopus will account for ‘engineer’, ‘engineers’ and ‘engineering’, and ‘practice’ will account for ‘practice’, ‘practise’ and ‘practicing’. Similarly, Scopus automatically accounts for field variants in author searches, including initials in the place of given names. For the author search (P4), the list of names was based on researchers who were listed as first author at least twice in either Mazzurco et al.’s list or the Brunhaver et al. (Citation2018) bibliography on engineering practice, or who were identified by the present study’s first author as having studied activities in engineering practice.

For searching the grey literature (G1 to G3), we used Google Advanced Search and Google web-browsing. Results from the Google Advanced Searches were screened according to the title and the short text underneath each title. For the web-browsing, we browsed official websites of each government body for surveys, questionnaires, data, or statistics relating to engineering activities. A forward citation search on the grey literature was not possible due to these outputs not being indexed by bibliographic databases.

We applied five screening criteria to the identified outputs’ abstracts and, for those that passed the abstract screening, to the full outputs. When a list of activities was not available in the text, we contacted the corresponding author to request the associated list. The five inclusion criteria (I1 to I5) were:

  • I1. Engineering activities (or tasks) must be listed (not competency-, knowledge- or skill-based items).

  • I2. The list must have been developed using empirical engineering practice research (not student-based activities, not theoretical frameworks, nor conjecture).

  • I3. The list of engineering activities must be applicable to different engineering disciplines.

  • I4. The list of engineering activities must have applicability for different experience levels.

  • I5. There must be evidence of content validity checking or survey pre-testing (Boateng et al. Citation2018).

We included the criteria of content validity or survey pre-testing to focus on research with a higher degree of rigour. We removed this restriction when we broadened the search strategy (Step 3), in recognition that a lack of validity checking or survey pre-testing does not necessarily mean a particular activity is not valid. To ensure the appropriate application of these criteria, the present study’s second author reviewed 10% of the returned outputs against these criteria. Following this screening, we chose one initial engineering activities list – termed the Step 1 list – as a starting point for Step 2.

2.1.2. Step 2. Extend list using pilot survey

A pilot survey for the BeLongEng Project was run in late 2021 to test and improve the survey. The pilot survey was granted ethics approval by the University of Canterbury Human Ethics Committee (HREC Reference 2021/41). Recruitment channels for the pilot survey included professional connections of researchers from the BeLongEng Project and social media advertising. Participants opted into the research through informed consent, utilising a participant information sheet. There were 40 pilot study participants, all of whom held engineering qualifications. The qualification level varied from an Advanced Diploma to PhD. In total, 57% of the pilot study participants resided in Australia and 43% resided in New Zealand. Women represented 52.5% of the sample. The average age of participants was 38. The engineering disciplines nominated by the pilot study participants were primarily civil, project management, environmental, transportation and management. Further demographics of the pilot study participants are reported elsewhere (Richards Citation2021). The pilot survey included questions relating to the engineering activities from the Step 1 list, the outcomes and analysis of which are beyond the scope of this paper. The survey included an open-ended question asking participants to identify any activities that were not covered by the engineering activity questions (i.e. the Step 1 list). We applied the exclusion criteria (E1 to E6) to the pilot study participants’ responses to the open-ended question before adding any new activities. This process created the Step 2 list.

2.1.3. Step 3. Extend list using other lists

Steps 1–2 may have omitted some common engineering activities. To mitigate against this risk, we reassessed the research outputs identified in Step 1 using fewer criteria (I1 to I3) to identify additional engineering activities. These criteria were applied to the abstracts and, for those that passed the abstract screening, to the full outputs. We then applied the exclusion criteria (E1 to E6) before adding activities. This process created the Step 3 list.

2.1.4. Step 4. Extend list using qualitative and mixed-method literature

The Step 3 list may have omitted engineering activities reported in qualitative and mixed-methods engineering practice research. To mitigate this risk, we reviewed 149 qualitative and mixed-method engineering practice publications, as listed by Mazzurco et al. (Citation2021, see List of Step 4 Publications in the online Supplementary Material), to extract engineering activities mentioned in the title, abstract and full text. These publications account for studies of some 9,000 engineers (Mazzurco et al. Citation2021); the exact number is not determinable because multiple publications can be based on one study. We then applied the exclusion criteria (E1 to E6). This process created the Step 4 list.

2.1.5. Step 5. Condensed list

We condensed the activities in the Step 4 list by having two authors (authors one and three) independently identify similar engineering activities. The two authors then worked together to reach consensus before removing similar items. Where there was disagreement on similarity, the activity was retained. No exclusion criteria were applied in this step. This process created the Step 5 list.

2.1.6. Step 6. Baseline survey

In Step 6, we prepared the list of common engineering activities from Step 5 for use in the baseline survey for the BeLongEng Project. The baseline survey was granted ethics approval by the University of Canterbury Human Ethics Committee with the list of engineering activities added as an amendment. Recruitment for the baseline survey cohort occurred from February to June 2022. The population of interest was formally recognised engineers from Australia and New Zealand. The formal participant criteria were:

  1. Participants must have

    1. a 2-, 3-, or 4-year engineering qualification from an Australian or New Zealand tertiary institution; or

    2. a postgraduate engineering qualification from Australia or New Zealand; or

    3. recognition as having equivalent standing to at least a graduate level through membership to a professional engineering society in Australia or New Zealand, or

    4. live in Australia or New Zealand and be eligible for membership to Engineers Australia or Engineering New Zealand, and

  2. Participants must expect to be working for at least the next 10 years.

Recruitment channels included advertising in engineering publications and social media, and emails to engineering alumni of 24 tertiary institutions in Australia and New Zealand. Participants opted into the research through informed consent, utilising a participant information sheet. The baseline cohort includes 889 participants. Relative to the engineering population across Australia and New Zealand, participants were more likely to be women, from New Zealand, and more highly qualified. Further demographics of the baseline participants are reported elsewhere (Crossin et al. Citation2022). As per the pilot survey described in Step 2, the baseline survey included an open-ended question to identify other engineering activities. Following the baseline survey, activities were extracted from the freeform responses and subjected to exclusion criteria E1 to E6. This process created the final list of common engineering activities.

2.2. Pairing activities to competencies and competency modification

We paired the activities to competencies in two stages: Stage A, which followed Step 1, and Stage B, which followed Step 6. Stage A was used to identify gaps in, and subsequently modify, the competency descriptors, whilst Stage B was used to confirm the pairings. For Stage A, the first and third authors reviewed coverage of the activities list against Passow and Passow’s (Citation2017) sixteen competencies. The coverage was reviewed using a two-dimensional matrix, with the list of common engineering activities on one axis and the 16 competencies on the other. Both authors independently paired each activity and competency. The two independent pairings from Stage A were then merged, and the outcomes were reviewed. The three possible outcomes for each activity-competency pairing were:

  1. Agreement. Both authors agreed that a given activity could not be paired with one of the competencies. In this case, agreement may be due to a potential gap in the coverage of the competency.

  2. Agreement. Both authors agreed that the activity could be paired with a competency.

  3. Disagreement. One author paired the activity to a competency, whilst the other did not. This could be a misinterpretation of a competency or activity by one of the authors.

For pairings with outcome 3, authors 1 and 3 debated the outcome to reach a consensus, representing either outcome 1 or 2. For pairings with outcome 1, the competency description was modified to account for the activities that could not be paired.

For Stage B, authors 1 and 3 independently created new activity-competency pairings, without referring to the original pairings (Stage A), by marking their initials (A1 or A3) in each cell where the author considered there to be a pairing. We then combined the two independent pairings, with the combined markings in each cell being either agreement of the activity-competency pairing (A1A3; both authors agreed), agreement to no pairing (blank) or disagreement on pairing (only one author paired the activity to a competency). We retained the disagreements in the pairings (see the Activity-Competency Map in the online Supplementary Material), as we recognise that the pairings were subjective. We calculated the inter-rater reliability of the activity and competency pairings using Cohen’s kappa coefficient.
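The Stage B merge of the two independent pairings can be sketched with boolean matrices. The example below uses made-up pairings over three activities and two competencies (not the study’s data) to show how the combined cell markings and the observed agreement p0 arise:

```python
import numpy as np

# Hypothetical independent pairings by authors 1 and 3 over
# 3 activities x 2 competencies (1 = the author marked a pairing).
a1 = np.array([[1, 0],
               [1, 1],
               [0, 0]])
a3 = np.array([[1, 0],
               [0, 1],
               [0, 0]])

# Combined cell states, as described in the text:
# "A1A3" = both agree on a pairing, "" = both agree on no pairing,
# "A1" / "A3" = disagreement (only one author paired).
labels = np.where(a1 & a3, "A1A3",
         np.where(a1, "A1",
         np.where(a3, "A3", "")))
print(labels)

# Observed agreement p0 = proportion of cells where the raters match;
# this is the p0 that feeds into Cohen's kappa.
p0 = (a1 == a3).mean()
print(p0)  # 5 of 6 cells agree → 0.8333...
```

The disagreement cells (here the single "A1") are the ones retained in the published Activity-Competency Map rather than resolved.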

2.3. Pairing activities to IEA defining characteristics

The IEA graduate attributes represent increasing levels of complexity of the work expected for an engineering technician (DA), engineering technologist (SA), and an engineer (WA). As previously described, the IEA attributes share a common graduate defining characteristic (GDC) as a stem. For example, DA4 (‘Conduct investigations of well-defined …’), SA4 (‘Conduct investigations of broadly-defined …’) and WA4 (‘Conduct investigations of complex …’) are singularly related to the GDC of ‘Investigation: Breadth and depth of investigation’. The activity questions we use in the survey do not relate to complexity; asking questions relating to complexity would require anchor points of complexity for every activity. This meant that pairing to the IEA outcomes needed to be performed between the activities and the GDC’s, rather than between the activities and the DA, SA or WA attributes. To create the activity to IEA GDC pairing, we first transformed the activity-competency pairings where there was consensus between both authors (A1 and A3). To transform the pairings to the IEA GDC’s, we used Passow and Passow’s competency to Washington Accord graduate attribute (WA) mappings (Passow and Passow Citation2017, 493–495) as a guide. During this process, the WA attributes served as a proxy for the GDC’s. This proxy was appropriate because there can only ever be one WA for each GDC. Passow and Passow’s mappings were based on the old Washington Accord (containing 12 attributes), compared with the current 11. To account for this, we paired activities aligned to the old WA 6 and 7 to the one updated GDC, ‘The Engineering and The World’. In addition, some of the Passow and Passow competencies were mapped to multiple WA’s (e.g. ‘Define constraints’ is mapped to WA’s 3, 6 and 7). To account for this, we split the competency into multiple GDC’s and copied the activity to each of these GDC’s. Authors 1 and 3 then independently reviewed the appropriateness of each activity to GDC pairing. These two independent reviews were then merged. We then debated the outcomes to reach consensus on the activity to GDC pairings. We discarded any disagreements to limit the subjectivity of the activity to IEA GDC map.
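The splitting of multi-attribute competencies during the GDC transformation can be sketched as a dictionary expansion. The mappings below are illustrative placeholders, not the study’s actual tables:

```python
# Hypothetical competency -> GDC mapping after merging the old WA6 & WA7
# into the single updated GDC. A competency mapped to several attributes
# is split so that its activities are copied to each GDC.
competency_to_gdc = {
    "Define constraints": ["WA3", "The Engineering and The World"],  # assumed
    "Provide advice to others": ["WA11"],                            # assumed
}
activity_to_competency = {
    "Searching for information": ["Define constraints"],             # assumed
}

# Each activity inherits the union of the GDCs of its paired competencies.
activity_to_gdc = {
    activity: sorted({gdc
                      for comp in comps
                      for gdc in competency_to_gdc[comp]})
    for activity, comps in activity_to_competency.items()
}
print(activity_to_gdc)
# → {'Searching for information': ['The Engineering and The World', 'WA3']}
```

The real transformation was then checked by hand, with both reviewing authors debating each resulting activity to GDC pairing.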

3. Results

3.1. List of engineering activities

The results of the development of the list of common engineering activities are presented schematically in Figure 1 and described below.

During Step 1, we identified two lists of engineering activities that met the inclusion criteria: the O*Net activity system (O*Net Development Citation2021) and the list published by Trevelyan (Citation2008) (Trevelyan and Tilli Citation2008). The Trevelyan (Citation2008) list describes 85 engineering activities across 10 categories. In contrast, the O*Net system classifies activities into three tiers: elements, individual work activities, and detailed work activities. We collated the O*Net engineering activities for all engineering jobs in O*Net; this included 31 tier one, 136 tier two, and 304 tier three engineering activities. The third tier of O*Net captures discipline-specific activities, which did not align with our intent to identify common activities. In addition, we were mindful of the need to balance data resolution with the burden of surveying participants on hundreds of activities. Conversely, we considered that the tier one O*Net activities were unlikely to provide enough resolution for meaningful insights into practice. For example, the tier one activity of ‘Getting information’ is linked to eight tier two activities, including ‘Gather data about operational or development activities’ and ‘Investigate organisational or operational problems’. We considered that the tier two activities provided a good balance between the resolution needed and likely participant burden. The O*Net database is consistently updated with emerging job activities, whereas Trevelyan (Citation2008) is not under ongoing review. We therefore chose the 136 tier two engineering activities from O*Net as the Step 1 list.

In Step 2, we received two open-ended responses in the pilot study. Both activities were already listed and were identified as duplicates; thus, Step 2 yielded no additional activities. In Step 3, we identified 225 activities from 12 outputs, with only 14 activities surviving the exclusion criteria. The Step 3 list therefore contained 150 engineering activities. The additional engineering activities were identified from (Brunhaver Citation2015; Daka and Fraser Citation2014; Garousi et al. Citation2015, Citation2016, Citation2020; Garousi and Zhi Citation2013; Kreth Citation2000; Obstfeld Citation2005; Reychav and Aguirre-Urreta Citation2014; Robinson Citation2010, Citation2012; Trevelyan Citation2008; Williams and Figueiredo Citation2011).

In Step 4, the review of the mixed-method and qualitative literature identified 793 activities from 149 publications, with none surviving the exclusion criteria. Thus, this step yielded no additional activities. In Step 5, we identified and removed 64 similar activities, reducing the list from 150 to 86 activities. In Step 6, we received a total of 889 completed and submitted responses to the baseline survey, which included the freeform question about other engineering activities. Participants responded with 52 activities, none of which survived the exclusion criteria.

In summary, we identified 1,208 engineering activities in Steps 1–6, 86 of which were retained after the application of the exclusion criteria and condensing.

3.2. Modification of competencies and mappings

In Stage A, we identified that five activities could not be paired to the 16 Passow and Passow (Citation2017) competencies. The activities that could not be paired related to interactions engineers have with physical equipment, resourcing activities for experiments, legal activities, directing and leading others, and providing advice to others. We modified five competency names and/or competency descriptions of Passow and Passow (Citation2017) to allow for a relationship with these activities. We also created a new, seventeenth competency, ‘Provide advice to others’. The remaining eleven Passow and Passow (Citation2017) competencies were paired to the final list of engineering activities. A summary of the changes to the competencies is reported in Table 2.

Table 2. Modified Passow and Passow (Citation2017) competencies and competency descriptors. Modified or new wording are provided in Italics.

In Stage B, the level of observed agreement (p0) between the two authors was 86% and the level of chance agreement (pe) was 62%, resulting in a Cohen’s kappa coefficient of 0.63, indicating a moderate to substantial level of agreement (McHugh Citation2012). There were two activities for which consensus pairings to either the competencies or the IEA GDC’s were not reached: ‘Performing administrative or clerical activities (e.g. writing and responding to emails, scanning documents)’ and ‘Travelling to other work sites’.
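Cohen’s kappa follows directly from the two agreement figures; as a quick arithmetic check using the rounded values reported above:

```python
# Reported values: observed agreement p0 = 86%, chance agreement pe = 62%.
p0 = 0.86
pe = 0.62

# Cohen's kappa corrects the observed agreement for the agreement
# expected by chance alone: kappa = (p0 - pe) / (1 - pe).
kappa = (p0 - pe) / (1 - pe)
print(round(kappa, 2))  # → 0.63
```

That is, 0.24 of excess agreement over a maximum possible 0.38 yields kappa ≈ 0.63, matching the reported coefficient.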

3.3. Finalised taxonomy

The final result is an engineering practice taxonomy comprising a list of 86 common engineering activities, paired to 17 engineering competencies and to the eleven International Engineering Alliance graduate attribute defining characteristics. The list of common engineering activities and engineering competencies is documented in the Appendix. The two mappings are available in the online Supplementary Material (see the Activity-Competency Map and the Activity-IEA GDC Map). A sample of the finalised taxonomy is illustrated in Figure 2.

Figure 2. Sample of the taxonomy, showing a) activity to competency author pairings (A1 and A3) and b) activity to IEA graduate attribute pairings (*).


4. Discussion

We have addressed the guiding research question by creating a taxonomy of engineering practice, which classifies common engineering activities. The activities are linked to broader engineering outcomes, specifically competencies and the International Engineering Alliance’s eleven graduate attribute defining characteristics, which can in turn be linked directly to the Washington, Dublin and Sydney Accords attributes.

In this section, we discuss some of the challenges encountered in implementing the method, including the limitations of our literature search strategy, the outcomes of the surveys, and mapping the activities to the competencies and IEA GDCs. We conclude the discussion by describing the utility of our taxonomy, its limitations, and avenues for future research.

4.1. Identifying activities

The initial list of common engineering activities we selected in Step 1 (O*Net) was developed for a United States context. There was a risk that these activities were not appropriate for our focus on Australian and New Zealand engineers. To counter this risk, we verified the list against a second list (in Step 2) developed in an Australian context (Trevelyan Citation2008). We could not repeat this verification for the New Zealand context, as the New Zealand-based engineering practice literature is limited to competency studies (Pons Citation2015).

The identification of common engineering activities in the systematic literature searches (Steps 1 and 3) was limited by our method, including our search strategy, the search terms used, database indexing, and the choice and application of inclusion criteria. These systematic searches may have missed engineering activities that are either tacit (experiential) or explicit (codified). Some such activities have been codified in engineering bodies of knowledge, which are typically developed over an extended period with input from engineering experts.

When we extended our search beyond the restrictions imposed in the systematic literature searches (Steps 1 and 3), through the qualitative and mixed-method literature search (Step 4) and the two surveys of over 900 participants (Steps 2 and 6), we did not identify any additional activities to add. The inclusion of responses from New Zealand-based participants countered our inability to verify the Step 1 list for a New Zealand context. In addition, the fact that we were not able to identify any additional activities in Steps 2, 4 and 6 indicates that we reached saturation in our search for engineering activities. This saturation, however, does not discount that other common engineering activities exist; we do not claim to have captured all common engineering activities. We recognise that engineering, and the activities that engineers do, will evolve. At this point in time, however, we consider that we have captured a representative list of common engineering activities appropriate for use in at least the Australian and New Zealand contexts.

4.2. Challenges with mapping activities to competencies

During the transformation from the activity-competency map to the activity-IEA GDC map, we occasionally encountered a lack of transferability between the two systems. For example, in Stage A we paired the ‘Maintaining systems, tools, equipment or structures’ activity to the ‘Define constraints’ competency. When we transformed this pairing in Stage B, we identified that the ‘Define constraints’ competency was originally mapped by Passow and Passow to the old Washington Accord attributes WA3, 6 and 7, corresponding to the ‘Design/development of solutions’ and the now superseded ‘The engineer and society’ and ‘Environment and sustainability’ IEA GDCs. However, when we checked the transformation, we found that we could not map ‘Maintaining systems, tools, equipment or structures’ to these IEA GDCs. We corrected these discrepancies during Stage B. We also noted that Passow and Passow did not map their ‘Think creatively’ and ‘Take initiative’ competencies to the IEA GDCs of ‘Design/development of solutions’ and ‘Individual … work’, respectively, even though creativity and functioning effectively as an individual are explicitly identified in the related WA attributes (WA3 and WA9, now revised to WA3 and WA8). This challenge of transforming mappings demonstrates some of the difficulties in defining competencies and in transferring descriptors between different frameworks. That some of the activity pairings did not transfer between the competency and IEA GDC maps does not limit the utility of this work; the large majority of activities were paired to at least one of the IEA GDCs.

The two activities that could not be paired to either the competencies or the IEA GDCs, ‘Performing administrative or clerical activities (e.g. writing and responding to emails, scanning documents)’ and ‘Travelling to other work sites’, could be considered enablers of, or peripheral to, the other engineering activities. For example, scanning documents is a likely precursor to ‘Reading documents or materials to inform work processes’. Nevertheless, these two activities are things that engineers do, and should be retained in the list.

We also noted that a number of engineering activities did not pair to the IEA GDCs, especially activities relating to making decisions and providing advice to others. This lack of pairing raised the question of whether these activities could be integrated into other higher-level outcomes, such as applying knowledge, design and lifelong learning, or whether they relate to separate outcomes. Interestingly, we identified that the International Engineering Alliance associates providing advice and making decisions with more experienced engineers through its professional competence profiles (International Engineering Alliance Citation2021). The lack of mapping to the IEA GDCs for some of these activities, together with the inclusion of similar descriptions in the later professional competence profiles, suggests that the IEA considers that graduates should not be expected to undertake certain elements of work. This finding also highlights a limitation of our activity to IEA GDC mapping: its application is likely limited to graduates. The question then arises as to when graduates start to undertake the activities aligned to the professional competence profiles. We intend to address this question in future research.

4.3. Utility

We have identified two main utilities for our engineering practice taxonomy. Firstly, the taxonomy is the foundation for studying the common activities that engineers do (e.g. through measurement of frequency and relevance, as described in Mazzurco et al. (Citation2021)). Such measurement will be useful to identify differences, similarities, criticalities, and changes in the activities that engineers do across different contexts and variables, such as disciplines, experience level, regions and time.

The taxonomy provides more specificity on the activities associated with different engineering outcomes. This specificity provides the basis for the second main utility of the taxonomy: identifying which activities could be used in a higher education setting to demonstrate attainment of graduate attributes. For example, an engineering academic might want to identify an engineering activity associated with Washington Accord attribute WA11, ‘Recognize the need for, and have the preparation and ability for i) independent and life-long learning ii) adaptability to new and emerging technologies and iii) critical thinking in the broadest context of technological change’. Using our taxonomy, the academic would match this attribute with the associated graduate attribute defining characteristic, in this case ‘Lifelong learning: Duration and manner’. The academic would then search the taxonomy for activities linked with this outcome. In this case, the taxonomy includes four common activities: 1. ‘Maintaining knowledge through continual professional development’, 2. ‘Managing risks associated with work activities, including health and safety, commercial, environmental etc.’, 3. ‘Mentoring others’ and 4. ‘Planning work activities, projects, programs or events’. The academic could then design assessment tasks that integrate these real engineering activities, focussing on the lifelong learning elements within them, providing a verifiable link between what engineers do and assessment. As reported in the introduction, however, activities can be linked to multiple competencies. The academic could further identify which other graduate attributes are likely associated with these activities. For example, ‘Maintaining knowledge through continual professional development’ is also paired with the ‘Engineering Knowledge’ graduate attribute defining characteristic, which is in turn linked to WA1, DA1 and SA1.
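The lookup workflow described above can be sketched as a pair of simple mappings. The fragment below is purely illustrative: it encodes only the WA11 example from the text, and the dictionary structure and function name are our own, not the format of the published Supplementary Material:

```python
# Illustrative fragment of the taxonomy: Washington Accord attribute code ->
# IEA graduate attribute defining characteristic (GDC) -> common activities.
# Only the WA11 example from the text is encoded here; the full mappings
# are in the article's Supplementary Material.
WA_TO_GDC = {"WA11": "Lifelong learning: Duration and manner"}

GDC_TO_ACTIVITIES = {
    "Lifelong learning: Duration and manner": [
        "Maintaining knowledge through continual professional development",
        "Managing risks associated with work activities, including health "
        "and safety, commercial, environmental etc.",
        "Mentoring others",
        "Planning work activities, projects, programs or events",
    ],
}

def activities_for_attribute(wa_code: str) -> list[str]:
    """Return the common activities linked to a Washington Accord attribute."""
    gdc = WA_TO_GDC[wa_code]  # attribute -> defining characteristic
    return GDC_TO_ACTIVITIES[gdc]  # defining characteristic -> activities

for activity in activities_for_attribute("WA11"):
    print(activity)
```

An academic could use such a lookup to enumerate candidate activities for an assessment task, then invert the mappings to see which other attributes those activities also evidence.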

4.4. Limitations

The list of engineering activities does not cover discipline-specific activities, for example finite element modelling in mechanical engineering or geological characterisation in civil engineering. The utility of the taxonomy is potentially limited to the Australian and New Zealand contexts; researchers from other regions should consider the appropriateness of the taxonomy relative to their own context. The list of common engineering activities does not describe the complexity of tasks, so care should be taken when extrapolating the activities to complexity-related graduate attributes, such as those in the Washington, Dublin and Sydney Accords. The mappings we created were subjective; we countered this subjectivity by using multiple authors during the mapping and reaching consensus where appropriate. Although the level of agreement was interpreted to be moderate to substantial, this does not rule out that pairings are based on opinion and may change. Future researchers may consider repeating the pairing process to account for their own context. Finally, the methods we used have likely missed some common engineering activities; new engineering activities will emerge over time, and this emergence needs to be monitored.

4.5. Future work

In the future, we plan to map the list of common engineering activities to other frameworks, such as Engineers Australia’s competency standards, to further enhance the taxonomy’s utility. We chose not to complete a mapping to Engineers Australia’s Stage 1 standard due to its forthcoming review (Howard and Foley Citation2021). We will also test the construct validity of the engineering activities survey through exploratory and confirmatory factor analysis (EFA and CFA, respectively) of data from the baseline survey. EFA will allow us to investigate relationships between activities and participant demographics, while CFA will allow for the exploration of other latent constructs that may be identified during EFA.

To address the limitations of our search method, we will include an open-ended question in all future surveys to identify activities not covered by our taxonomy. Conversely, we will review the activities that are considered by survey participants as being not relevant to their engineering jobs, as these may be redundant activities. In addition, we will monitor literature and O*Net on an ongoing basis to identify emerging engineering activities.

5. Conclusion

We have developed an engineering practice taxonomy comprising a list of 86 engineering activities, paired to 17 engineering competencies and the 11 International Engineering Alliance graduate attributes. This taxonomy represents a unified framework for describing the common activities engineers do, and has significant utility for future engineering education research.


Acknowledgments

We gratefully acknowledge Sabbia Tilli and Melinda Kreth who provided further details of the surveys used in their research. We acknowledge the in-kind support of the BeLongEng Project’s peak-body supporters during the progress of this research paper; the Australian Council of Engineering Deans, ACE New Zealand, Engineers Australia, Engineering New Zealand | Te Ao Rangahau, Engineers Without Borders Australia, IEEE Australia, the Institute of Public Works Engineering Australasia (Australia and New Zealand), the Minerals Council of Australia, the New Zealand Council of Engineering Deans, and Vocational Engineering Education New Zealand.

Disclosure statement

No potential conflict of interest was reported by the authors.

Supplementary Material

Supplemental data for this article can be accessed online at https://doi.org/10.1080/22054952.2023.2214454.

Additional information

Notes on contributors

Enda Crossin

Enda Crossin is an Associate Professor and is the Director of the Engineering Management Programmes at the University of Canterbury, New Zealand. He holds a PhD from The University of Queensland, Brisbane, Australia.

Jessica I. Richards

Jessica Richards is a graduate of the Master of Science programme in psychology from the University of Canterbury, New Zealand.

Sarah Dart

Sarah Dart is Strategic Lead for Learning & Teaching Development, Impact and Recognition within QUT’s Academy of Learning and Teaching (QALT) and a Senior Lecturer in the School of Mechanical, Medical and Process Engineering at QUT, Brisbane, Australia. She holds a PhD from QUT, Brisbane, Australia.

Katharina Naswall

Katharina Näswall is a Professor in psychology at the University of Canterbury, New Zealand. She holds a PhD from Stockholm University.

References

  • ABET. 2019. Criteria for Accrediting Engineering Programs. Baltimore, Maryland: ABET Engineering Accreditation Commission.
  • American Society of Civil Engineers. 2019. Civil Engineering Body of Knowledge: Preparing the Future Civil Engineer, 3rd. American Society of Civil Engineers. 10.1061/9780784415221
  • Ashforth, B. E., D. M. Sluss, and A. M. Saks. 2007. “Socialization Tactics, Proactive Behavior, and Newcomer Learning: Integrating Socialization Models.” Journal of Vocational Behavior 70 (3): 447–462. doi:10.1016/j.jvb.2007.02.001.
  • Australian Council of Engineering Deans. 2019. Engineering Futures 2035. http://www.aced.edu.au/downloads/Engineering%20Futures%202035_Stage%201%20report%20for%20ACED_May_16_2019.pdf
  • Boateng, G. O., T. B. Neilands, E. A. Frongillo, H. R. Melgar-Quiñonez, and S. L. Young. 2018. “Best Practices for Developing and Validating Scales for Health, Social, and Behavioral Research: A Primer.” Front Public Health 6: 149. doi:10.3389/fpubh.2018.00149.
  • Borrego, M., M. J. Foster, and J. E. Froyd. 2014. “Systematic Literature Reviews in Engineering Education and Other Developing Interdisciplinary Fields.” Journal of Engineering Education 103 (1): 45–76. doi:10.1002/jee.20038.
  • Boxall, P., and M. Steeneveld. 1999. “Human Resource Strategy and Competitive Advantage: A Longitudinal Study of Engineering Consultancies.” Journal of Management Studies 36 (4): 443–463. doi:10.1111/1467-6486.00144.
  • Brunhaver, S. R. 2015. Early Career Outcomes of Engineering Alumni: Exploring the Connection to the Undergraduate Experience. Stanford University. https://searchworks.stanford.edu/view/11367251.
  • Brunhaver, S. R., R. Streveler, C. Carrico, H. Matusovich, P. Boylan-Ashraf, and S. Sheppard. 21-24 Oct. 2015. Professional Engineering Pathways Study: A Longitudinal Study of Early Career Preparedness and Decision-Making. 2015 IEEE Frontiers in Education Conference (FIE), El Paso, Texas.
  • Brunhaver, S. R., A. C. Strong, B. Jesiek, R. F. Korte, and R. Stevens. 2018. Research on Engineering Practice (REP) Workshop International Network of Engineering Studies Conference, Santa Clara, California. https://inesweb.org/files/REP%20Workshop%20Bibliography%20v2.pdf
  • Campion, M. A., A. A. Fink, B. J. Ruggeberg, L. Carr, G. M. Phillips, and R. B. Odman. 2011. “Doing Competencies Well: Best Practices in Competency Modeling.” Personnel Psychology 64 (1): 225–262. doi:10.1111/j.1744-6570.2010.01207.x.
  • Crossin, E., D. Norriss, K. Näswall, F. Pawsey, and G. Rowe. 2022. The BeLongEng Project - Baseline Report.
  • Daka, E., and G. Fraser. 2014. “A Survey on Unit Testing Practices and Problems.“ 2014 IEEE 25th International Symposium on Software Reliability Engineering, Naples.
  • Engineers Australia. 2019. Stage 1 Competency Standard for Professional Engineer. Canberra: Engineers Australia.
  • Frey, C. B., and M. Osborne. 2013. The Future of Employment: How susceptible are jobs to computerisation. https://www.oxfordmartin.ox.ac.uk/downloads/academic/The_Future_of_Employment.pdf
  • Garousi, V., A. Coşkunçay, A. Betin-Can, and O. Demirörs. 2015. “A Survey of Software Engineering Practices in Turkey.” The Journal of Systems & Software 108: 148–177. doi:10.1016/j.jss.2015.06.036.
  • Garousi, V., A. Coşkunçay, O. Demirörs, and A. Yazici. 2016. “Cross-Factor Analysis of Software Engineering Practices versus Practitioner Demographics: An Exploratory Study in Turkey.” The Journal of Systems & Software 111: 49–73. doi:10.1016/j.jss.2015.09.013.
  • Garousi, V., M. Felderer, M. Kuhrmann, K. Herkiloğlu, and S. Eldh. 2020. “Exploring the Industry’s Challenges in Software Testing: An Empirical Study.” Journal of Software: Evolution and Process 32 (8): e2251. doi:10.1002/smr.2251.
  • Garousi, V., and J. Zhi. 2013. “A Survey of Software Testing Practices in Canada.” The Journal of Systems & Software 86 (5): 1354–1376. doi:10.1016/j.jss.2012.12.051.
  • Howard, P., and B. Foley. 2021. Workshop. Reviewing the Engineers Australia Competencies. Research in Engineering Education Symposium & Australasian Association of Engineering Education Conference, Perth. https://rees-aaee21.org/wp-content/uploads/2021/10/REES_AAEE_2021_paper_376-final.pdf
  • International Engineering Alliance. 2014. 25 Years of the Washington Accord. Wellington. https://www.ieagreements.org/assets/Uploads/Documents/History/25YearsWashingtonAccord-A5booklet-FINAL.pdf
  • International Engineering Alliance. 2021. Graduate Attributes and Professional Competencies. Version 2021.1. Wellington: International Engineering Alliance.
  • Jesiek, B., A. Johri, C. Brozina, and R. F. Korte. 2020. Work-In-Progress: Novel Ethnographic Investigations of Engineering Work Practices. Proceedings of the American Society of Engineering Education Annual Conference, Virtual On line.
  • Kreth, M. L. 2000. “A Survey of the Co-Op Writing Experiences of Recent Engineering Graduates.” IEEE Transactions on Professional Communication 43 (2): 137–152. doi:10.1109/47.843642.
  • Mahood, Q., D. Van Eerd, and E. Irvin. 2014. “Searching for Grey Literature for Systematic Reviews: Challenges and Benefits.” Research Synthesis Methods 5 (3): 221–234. doi:10.1002/jrsm.1106.
  • Mazzurco, A., E. Crossin, S. Chandrasekaran, S. Daniel, and G. R. P. Sadewo. 2021. “Empirical Research Studies of Practicing Engineers: A Mapping Review of Journal Articles 2000–2018.” European Journal of Engineering Education 46 (4): 479–502. doi:10.1080/03043797.2020.1818693.
  • McHugh, M. L. 2012. “Interrater Reliability: The Kappa Statistic.” Biochemia Medica 22 (3): 276–282. doi:10.11613/BM.2012.031.
  • National Center for O*NET Development. (2021). O*NET OnLine. Accessed March 24 2021 from https://www.onetonline.org/
  • Obstfeld, D. 2005. “Social Networks, the Tertius Iungens Orientation, and Involvement in Innovation.” Administrative Science Quarterly 50 (1): 100–130. doi:10.2189/asqu.2005.50.1.100.
  • Page, M. J., J. E. McKenzie, P. M. Bossuyt, I. Boutron, T. C. Hoffmann, C. D. Mulrow, L. Shamseer, J. M. Tetzlaff, E. A. Akl, S. E. Brennan, R. Chou, J. Glanville, J. M. Grimshaw, A. Hróbjartsson, M. M. Lalu, T. Li, E. W. Loder, E. Mayo-Wilson, S. McDonald and D. Moher. 2021. “The PRISMA 2020 Statement: An Updated Guideline for Reporting Systematic Reviews.” BMJ 372: n71. doi:10.1136/bmj.n71.
  • Passow, H. J., and C. H. Passow. 2017. “What Competencies Should Undergraduate Engineering Programs Emphasize? A Systematic Review.” Journal of Engineering Education 106 (3): 475–526. doi:10.1002/jee.20171.
  • Patton, W., and M. McMahon. 2014. “A Systems Theory Framework of Career Development.” In Career Development and Systems Theory: Connecting Theory and Practice, 241–276. SensePublishers. doi:10.1007/978-94-6209-635-6_9.
  • Pons, D. J. 2015. “Changing Importances of Professional Practice Competencies Over an Engineering Career.” Journal of Engineering and Technology Management 38: 89–101. doi:10.1016/j.jengtecman.2015.10.001.
  • Reychav, I., and M. I. Aguirre-Urreta. 2014. “Adoption of the Internet for Knowledge Acquisition in R&D Processes.” Behaviour & Information Technology 33 (5): 452–469. doi:10.1080/0144929X.2013.765035.
  • Richards, J. 2021. Career Commitment and Turnover Intentions in Practising Engineers. Christchurch: University of Canterbury.
  • Robinson, M. A. 2010. “An Empirical Analysis of engineers’ Information Behaviors.” Journal of the American Society for Information Science and Technology 61 (4): 640–658. doi:10.1002/asi.21388.
  • Robinson, M. A. 2012. “How Design Engineers Spend Their Time: Job Content and Task Satisfaction.” Design Studies 33 (4): 391–425. doi:10.1016/j.destud.2012.03.002.
  • Sheppard, S. D., K. Macatangay, A. Colby, and W. M. Sullivan. 2009. Engineers: Designing for the Future of the Field. San Francisco: Jossey-Bass.
  • Shippmann, J. S., R. A. Ash, M. Batjtsta, L. Carr, L. D. Eyde, B. Hesketh, J. Kewoe, K. Pearlman, E. P. Prien, and J. I. Sanchez. 2000. “The Practice of Competency Modelling.” Personnel Psychology 53 (3): 703–740. doi:10.1111/j.1744-6570.2000.tb00220.x.
  • Stevens, R., A. Johri, and K. O’Connor. 2014. “Professional Engineering Work.” In Cambridge Handbook of Engineering Education Research, edited by A. Johri and B. M. Olds, 119–138. Cambridge University Press. doi:10.1017/CBO9781139013451.010.
  • Stevens, R., and A. Vinson. 2016. Institutional Obstacles to Ethnographic Observation in Engineering Industry. 2016 ASEE Annual Conference & Exposition, New Orleans, Louisiana.
  • Trevelyan, J. 2008. A Framework for Understanding Engineering Practice ASEE 2008 Annual Conference & Exposition, Pittsburgh, Pennsylvania. https://strategy.asee.org/3319
  • Trevelyan, J. 2014. “Towards a Theoretical Framework for Engineering Practice.” In Engineering Practice in a Global Context. Understanding the Technical and the Social, edited by B. Williams, J. Figueiredo, and J. Trevelyan, 33–60. London: CRC Press.
  • Trevelyan, J. 2019. “Transitioning to Engineering Practice.” European Journal of Engineering Education 44 (6): 821–837. doi:10.1080/03043797.2019.1681631.
  • Trevelyan, J., and S. Tilli. 2008. Longitudinal Study of Australian Engineering Graduates: Perceptions of Working Time. 2008 ASEE Annual Conference and Exposition, Pittsburgh, Pennsylvania.
  • Western, J., M. Haynes, D. A. Durrington, and K. Dwan. 2006. “Characteristics and Benefits of Professional Work: Assessment of Their Importance Over a 30-Year Career.” Journal of Sociology 42 (2): 165–188. doi:10.1177/1440783306058482.
  • Wikipedia. 2020. List of Engineering Societies. Accessed December 16 2020. https://en.wikipedia.org/wiki/List_of_engineering_societies
  • Williams, B., and J. Figueiredo. 2011. Engineering Practice–An Empirical Study. Proceedings of the SEFI Annual Conference. Lisbon