Research Article

Are we ahead of the trend or just following? The role of work and organizational psychology in shaping emerging technologies at work

Pages 120-129 | Received 10 Dec 2023, Accepted 22 Feb 2024, Published online: 23 Mar 2024

ABSTRACT

This position paper elaborates on three core themes that emerged from a panel discussion that was held at the 21st Congress of the European Association of Work and Organizational Psychology. The authors of this paper discussed the status quo and the future of Work and Organizational Psychology (WOP) research and practice amidst the advent of emerging workplace technologies. The discussion centred on the question of what role WOP scholars and practitioners should take within this interdisciplinary field and how future research should evolve from previous studies on automation. The paper systematically examines: (1) emerging technologies as a new type of technological change, (2) WOP’s role in the design of emerging technologies and socio-technical systems, and (3) hindrances in WOP becoming more involved. Based on our reflections regarding each of these themes, we propose seven actionable recommendations to move forward and to encourage the involvement of WOP in the development and implementation of emerging technologies at work.

Introduction

With its potential to address critical societal issues, the implementation of emerging technologies, such as artificial intelligence (AI), is booming in all areas of our lives, including work. As technologies become increasingly capable and autonomous, they will fundamentally transform how we work in the future (Parker & Grote, Citation2022). Yet, despite the clear goal to extend human capabilities at work, Work and Organizational Psychology (WOP) perspectives are rarely considered in technology development and implementation. Beyond how operators react to certain characteristics of systems, especially user interfaces, we still know surprisingly little about how emerging technologies affect people at work and how they may change the design of work in the future. WOP rarely takes an active role in technology development and has been content with taking a descriptive or explanatory stance on technology implementation and use. Only recently have calls to become more actively engaged in shaping the future of emerging technologies at work grown louder – but are we already too late to the party? What role should WOP play in technology development and implementation?

At the 21st Congress of the European Association of Work and Organizational Psychology, the authors of this position paper organized a panel discussion to examine the current and future directions of WOP research and practice in the field of emerging technologies at work. The aim of the discussion was to explore how WOP can contribute to this field, what role WOP researchers and practitioners should play in shaping technology development and implementation, and whether we need to depart from prior approaches that are typically applied to the topic of technologies in the workplace. The dialogue yielded three principal themes for this position paper: (1) emerging technologies as a new type of technological change, (2) WOP’s role in the design of emerging technologies and (re)design of work, and (3) hindrances in WOP becoming more involved. We conclude with seven central recommendations for involving WOP researchers and practitioners in shaping the future development and implementation of emerging technologies at work.

Are we facing a new type of technological change?

The study of how humans interact with technology has a longstanding tradition in various domains (e.g., human factors). Although technology has been continuously improving over the past decades, many have argued that, with the introduction of emerging technologies (e.g., generative AI), we are facing a new type of technological change (Endsley, Citation2023; Grote, Citation2023; Larson & DeChurch, Citation2020; Parker & Grote, Citation2022). Drawing on these recent discussions in the literature, which highlight several core aspects distinguishing current emerging technologies from prior technology-related change processes, the panellists discussed three central aspects.

First, there is an increase in system autonomy. Compared to classical levels-of-automation research (Kaber & Endsley, Citation2004; Parasuraman et al., Citation2000), today's technical systems are likely to reach higher levels of automation, with humans being withdrawn from work processes and only overseeing systems that not only perform actions automatically but also take all relevant decisions on their own (Parker & Grote, Citation2022). Further, while automation has in the past been restricted to narrow sets of tasks, emerging autonomous technologies may allow for a higher number and range of tasks to be processed autonomously (Bankins et al., Citation2023). Consequently, increasing system autonomy will allow for the delegation of new types of tasks and roles to these systems, substantially changing their role, possibilities for implementation, and importance within the organizational context (e.g., "from tool to teammate"; Phillips et al., Citation2011; Seeber et al., Citation2020). Depending on the domain of implementation, these high levels of autonomy can range from fully automating human roles, such as in the case of driverless taxis (e.g., Cruise), to allowing for higher levels of precision or lower cognitive load through shared decision-making, for instance in medical decision making (e.g., Reverberi et al., Citation2022). At the same time, researchers have pointed out the effects of these high levels of automation and system autonomy on the quality of work. For example, experimental studies in the domain of personnel selection or decision support indicate that system autonomy may impact users' decision-making satisfaction, self-efficacy, and stress (Langer, König, et al., Citation2021; Ulfert et al., Citation2022). Further, high levels of autonomy may change the value of human work, as some tasks may be autonomously completed by the technology (Berg et al., Citation2023).

A second distinguishing factor of emerging technologies is the decrease in system transparency. Whereas previous systems were often based on deterministic processes that were complex but, at least in principle, understandable given enough time and resources, today's systems are becoming increasingly difficult to understand. Contemporary technical systems frequently incorporate components leveraging state-of-the-art machine learning algorithms that lack transparency in how they arrive at specific decisions. Reasons for increasing system opacity are manifold (see, e.g., Burrell, Citation2016; Langer & König, Citation2021). Although workers' struggles with transparency have long been an issue in earlier forms of automation (see, e.g., Bainbridge, Citation1983), the advent of current AI systems introduces a novel challenge: sometimes, even the creators of these technologies lack a complete understanding of them (Anthony et al., Citation2023). This lack of transparency, combined with, for example, limited possibilities for fully understanding the underlying rationale of a system's decision, raises new questions such as "Can we reasonably expect that human workers will be able to oversee systems that they cannot fully understand?", "How much and what type of transparency should the system provide to enable this?", and "Who can be held responsible when wrong decisions are made: the human worker, the system, or its developers?".

Third, the panel delved into the self-learning capabilities of emerging technologies. The landscape of AI applications is evolving at a rapid pace. At the time of the panel discussion, many systems used in organizational practice did not have the ability to continuously learn from new inputs. Yet, with the spread of advanced AI applications, such as the growing adoption of tools based on large language models, it is conceivable that we are on the verge of witnessing a surge in systems that can dynamically adapt their decision-making processes in response to user inputs and changing environmental conditions. This represents a paradigmatic shift towards systems that may not only have opaque decision processes but whose decision processes may even change over time, leaving operators unsure about whether the system's reliability will improve or deteriorate. Prior literature has discussed the fact that technological features change over time (Nelson & Irwin, Citation2014; Tyre & Orlikowski, Citation1994). However, emerging technologies can change dynamically, even without intervention by developers, which makes it even more difficult for operators to stay aware of the system's state.

Although technical systems are becoming increasingly competent, the study of humans working with technological systems has a long tradition, with sociotechnical systems theory (STS) as a notable contributor to this field (Trist, Citation1981; Trist & Bamforth, Citation1951). STS theory developed a framework for describing how individuals within a particular system engage with its technology to shape joint results while functioning within a wider environment (Emery, Citation1959; Makarius et al., Citation2020). This stream of literature has predominantly focused on how humans and technical systems should work together, how technology should be designed in terms of appearance or functionality, and how automation may impact employees. For example, research on imperfect automated systems (McBride et al., Citation2014) has examined how people deal with system errors and how they use system outputs knowing that these can occasionally be faulty. Function allocation research (e.g., de Winter & Dodou, Citation2014; Fitts, Citation1951; Grote et al., Citation2000; Hancock & Scallen, Citation1996; Waterson et al., Citation2002) has focused on how to allocate functions in work tasks in ways that suit the abilities of humans and systems. In a similar vein, the literature based on action regulation theory offers a crucial perspective and tools for structuring work processes in technology-rich work environments (e.g., Hacker, Citation2003, Citation2022; Wächter et al., Citation1989). Specifically, this literature stresses the importance of employees' decision latitude and control over their work, suggesting that technology should not completely take over decision-making (Hacker, Citation2022). Like STS, this stream of literature focuses on human-centric work practices and emphasizes the importance of considering employees' experience and mental wellbeing in technology implementation. Similarly, work design research has pointed to the importance of employee motivation and satisfaction in the context of technology implementation (Parker & Grote, Citation2022). In this literature stream, technology is often considered a contextual factor (e.g., Morgeson & Campion, Citation2003) or ergonomic design a physical demand of working environments, adding another important perspective on how work should be designed in the context of emerging technologies (Parker & Grote, Citation2022).

In terms of how automation may impact employees, various perspectives have been taken. Most research has focused on mis- or disuse of technologies and how to improve operators' willingness to use technologies appropriately. For instance, Mosier et al. (Citation1996) defined automation bias as a consequence of individuals utilizing the outcome of the decision aid "as a heuristic replacement for vigilant information seeking and processing" (p. 205). Parasuraman and Manzey (Citation2010) describe complacency and automation bias effects in a theoretical model centring on the attentional challenges that people may face in operating and supervising highly reliable systems. Research on trust in automation (Hoff & Bashir, Citation2015; Lee & See, Citation2004) has brought many insights into system-, human-, and task-related factors contributing to over- or under-trusting technology. Similarly, situation awareness research highlights the role of system characteristics for users' awareness of their task and environment (Endsley, Citation1995, Citation2017). More recent work points towards humans and technology collaborating even more closely in the future, with the abovementioned improving capabilities allowing for co-learning, co-creation, or co-adaptation (van Zoelen et al., Citation2021).

Are we facing a new type of technological change? – Yes, partially. It is undeniable that current technological trends are transforming work as we know it. While certain aspects of this change resemble prior technologization developments, the enhanced capabilities of emerging technologies (e.g., in terms of autonomy or learning capabilities) present new and unique challenges that require fresh perspectives. At the same time, it is important to emphasize that this does not render all previous WOP research obsolete; instead, it underscores the need for adaptation and evolution. As recent literature suggests (Grote, Citation2023; Parker & Grote, Citation2022), researchers can build on the strong theoretical foundation that already exists and consider those very theories within new contexts (e.g., the role of STS theories for AI at work). The existing body of knowledge on what constitutes motivating, meaningful, and not (too) strenuous work (cf. Parker, Morgeson, et al., Citation2017) can be used in decisions around the implementation of emerging technologies (Berkers et al., Citation2023). Nevertheless, the current technological changes also result in new needs for technology development, research, and implementation. We will now discuss these needs, as well as the question of whether WOP should play a role in fulfilling them, in more detail.

Designing emerging technologies: what is needed, and should WOP play a role?

The evolving context of emerging technologies and their roles within organizations gives rise to diverse design needs. In addressing the topic of design, and to fully understand the role of emerging technologies at work, it is crucial, however, to broaden the perspective beyond the technological aspects alone, such as the development of the technology's architecture. Thoughtful design of the work environments in which these technologies are integrated is becoming increasingly essential and requires the integration of diverse perspectives.

First, there is a need to consider diverse stakeholder views in the design and implementation process. Whereas prior research has focused on users, it is crucial to gain deeper insights into all stakeholders involved in designing and implementing emerging technologies. This approach is congruent with STS principles for system design (e.g., Clegg, Citation2000, principles 4–6), which suggest that design should consider diverse stakeholder needs and be socially shaped. In WOP, an increasing number of studies focus on people who do not operate systems but are affected by systems and their outputs (Berkers et al., Citation2023; Langer & Landers, Citation2021). This perspective is vital, considering that future workplaces will likely encompass not only technologies visible to users but also an array of systems operating beyond the immediate awareness of employees (Anthony et al., Citation2023). Further, to better understand how future work should be designed, researchers have highlighted the need for more consideration of system developers, policy-makers, and employee representatives and how they contribute to designing and implementing emerging technologies at work (Anthony et al., Citation2023; Langer, Oster, et al., Citation2021; Parker & Grote, Citation2022). This discussion also highlights the wider system impact that must be considered when introducing emerging technologies and the need to consider these perspectives in various phases of technology development, from initial conception to implementation in organizations. Arguably, considering different stakeholders' perspectives will also come with costs and will reveal misaligned interests, for instance, in the development and implementation of new technologies at work (Bankins et al., Citation2023). For example, when discussing the level of transparency of AI-based performance evaluation systems in organizations, employees will likely want to know exactly how they can achieve a better evaluation. Yet, this may not be in the interest of the organization, as this kind of transparency may allow employees to game the evaluation system (Langer & König, Citation2023). Creating a better alignment of such divergent views and allowing for dialogue across stakeholder groups is essential not only for the effective design of technology but also for understanding the underlying social processes that impact implementation (Chhillar & Aguilera, Citation2022; Grote, Citation2023).

Second, research in this domain needs to go beyond a mere "productivity focus" to a broader sense of effectiveness that considers psychosocial work aspects related to the quality of working life and to employee engagement and wellbeing (Berkers et al., Citation2023). Much prior research on emerging technologies has been guided by task performance outcomes. Yet, it is becoming increasingly clear that the field also needs to consider a broader range of dependent variables – many of which are central to WOP as they relate to demands and resources that may change through the interaction of humans and technology, to role perceptions, and to employee health and wellbeing (Parker & Grote, Citation2022). Future research ought to concentrate more on offering design guidelines for the various stages of technology development and utilization, encompassing the pre-implementation, implementation, and post-implementation phases within organizations. As experts in the micro- and meso-level social factors affecting performance and wellbeing, WOP scholars and practitioners can provide expert input (e.g., on how humans collaborate at work, how work design impacts wellbeing, etc.), thereby going beyond technocentric views on human-technology interaction and technology implementation. For instance, WOP experts can apply work design theories to leverage the benefits of effective work arrangements when introducing robotic systems. This approach can also play a crucial role in minimizing the risks associated with inadequately designed roles, which may adversely affect employee wellbeing and overall performance (Berkers et al., Citation2023; Hacker, Citation2022; Holman, Citation2013; Parker et al., Citation2019). Further, WOP can provide "design, evaluation, and optimization targets" for developers – for instance, design for optimal workload, overseeability, or sustainable performance.

Getting involved in the early stages of the technology design process provides various benefits. For instance, when employee needs are already considered in the design of technologies, this may improve human-technology interaction, lead to higher quality jobs (e.g., more variety in tasks and better skill utilization), improve job outcomes (e.g., wellbeing, performance), or even support sustainable organizations (Landers & Marin, Citation2021; Parker & Grote, Citation2022). Currently, design decisions are typically made by the developers of the technologies. Taking WOP perspectives into account may further allow a shift away from requiring users to adapt to the technology, because every user has to work with the same system, towards more adaptive systems that recognize user needs (e.g., in robotics; Gasteiger et al., Citation2023).

Designing emerging technologies: What is needed, and should WOP play a role? – Yes, we need WOP perspectives both in technology design and in socio-technical systems design! WOP researchers and practitioners should take an active role in technology and implementation design, step forward, and get involved earlier in the process. Instead of merely reacting as firefighters who have to put out wildfires when poorly-designed technology meets poorly-prepared workers, we should take a proactive role as system designers who anticipate wildfires and prevent them from starting by creating work systems that consider employees' and other stakeholders' needs. Getting involved in designing new technologies and socio-technical systems that benefit employees will be an essential step in better understanding what collaborations between employees, organizational decision makers, engineers, and designers – including their relational dynamics – should look like. However, this also comes with the requirement that WOP researchers and professionals need to be prepared to take this more active role in the design of emerging technologies.

What is hindering WOP from becoming more involved?

Technology development is booming, and most organizations are set to increase the implementation of AI systems, robotics, and ICTs in various domains in the upcoming years. Yet, this change, as well as research on technology development, is currently still driven by disciplines other than WOP. While making it a goal to involve WOP in technology design processes is crucial (including both the design of the technology and its implementation), the discipline encounters challenges in pursuing this objective. What is hindering WOP researchers and practitioners from becoming more involved? The panellists identified a variety of challenges.

Challenges in conducting research and developing theory

In recent years, the number of WOP publications dedicated to emerging technologies (e.g., digitalization, AI) has been increasing (e.g., Beer & Mulder, Citation2020). Yet, most WOP publications in this domain are theoretical contributions, qualitative studies, or empirical studies using vignette methodologies. The choice of vignette studies is often necessitated by the unavailability of specific technological innovations within organizations (e.g., novel types of AI applications), by a lack of access to organizations for conducting field studies, or by the need for standardized environments to investigate specific research questions (Matza et al., Citation2021). This approach has limitations, however, especially in capturing the real-world complexities and dynamics of the practical implementation and use of emerging technologies. Moreover, the predominant thematic focus in current research tends to centre on the repercussions of technological change, with relatively less attention given to prospective considerations regarding the interaction between technology and work design and how they should be synergistically considered in implementation processes. Otherwise, technology implementation may result in poorly designed jobs that negatively affect employee wellbeing and performance (Humphrey et al., Citation2007).

In contrast to WOP, other disciplines (e.g., human-computer interaction or computer science) offer abundant empirical work on the dynamics of human-technology interaction that could contribute to the WOP literature. Yet, at present, these studies lack thorough integration into the WOP literature, as WOP authors typically do not consider them. This fragmentation of knowledge across disciplines hinders the development of a more holistic understanding of human-technology interaction, impeding the development of comprehensive and actionable insights for both academia and industry. The panellists see two main reasons for this lack of integration. First, many research fields and their publication cultures are strongly characterized by their (mono)disciplinary orientation. Literature from other disciplines is often not considered in WOP, limiting WOP's perspectives on technological topics. Similarly, technological fields (e.g., engineering) may fail to consider important insights and theories from WOP (e.g., antecedents of employee wellbeing). This leads to a limited representation of WOP topics in technical fields, as psychological theories are only partially considered. Second, with many disciplines now engaging in technology-oriented research, the body of scientific publications and the number of new technology-oriented journals are increasing drastically. This growth makes it nearly impossible to integrate all important insights, as the number of publications is becoming too large to keep track of. Further, as disciplines work on similar topics in parallel, there is a risk of multiple disciplines "reinventing the wheel". Better integration of what we already know about emerging technologies at work could be approached by fostering more interdisciplinary or even transdisciplinary collaboration. Yet, such collaborations come with their own unique challenges.

Challenges in interdisciplinary collaboration

First, integrating different disciplines is challenging due to differences in the terminologies and language they use. For instance, when the diverse areas of research mentioned earlier refer to the concept of fairness, this can mean very different things: a broad evaluation of a decision as "good" (Colquitt et al., Citation2005) or mathematical definitions of how to implement algorithmic fairness, including considerations of disparate treatment or adverse effects when using algorithms (Mitchell et al., Citation2021). Further, definitions of the technologies may differ and strongly vary in their level of specificity, as authors may refer to a broad category of technologies (e.g., "AI"), to specific technological tools (e.g., a decision-support system for cancer diagnostics), or to specific algorithms.

Second, it can be challenging to publish interdisciplinary work because research methods or theories do not match the expectations and requirements of reviewers at current ("monodisciplinary") publication outlets. Reviewers may find it challenging to assess interdisciplinary research adequately. This applies to methods that may be novel to other disciplines or to literature and theories that specific disciplines may not be aware of. Furthermore, reviewers in different disciplines have been trained to apply different standards. For instance, in psychology and management research, there is a long tradition of valuing "theoretical contributions". A manuscript that focuses more on implications for the design and implementation of algorithmic systems may thus stand little chance in psychology and management outlets. Conversely, HCI research values "design implications", which are rarely found in psychology and management papers. Thus, psychologists trying to publish in HCI outlets may have a difficult time getting published with theoretical contributions that do not (also) aim for design implications.

WOP's disciplinary culture can be a challenge

The panellists, however, do not only see challenges in conducting research, collaborating with technology experts, and publishing interdisciplinary work. Rather, traditions within the discipline of psychology can also be a hindrance, especially in educating future WOP researchers and practitioners and in supporting junior researchers.

Technology-related topics are typically not considered part of "traditional" psychology curricula. Although human factors used to be a much more prominent topic in psychology degrees, Norcross and colleagues (Norcross et al., Citation2016) report a decrease in human factors courses, and WOP courses more generally, between 2005 and 2014. This is despite scholars and psychological associations pointing to the relevance of human factors education for school curricula (Norcross et al., Citation2016; Parker & Grote, Citation2022; Tenenbaum, Citation2022). Consequently, many psychology students, as well as educators, lack an understanding of emerging technologies, their functioning, and their potential impact. Traditionally, psychology research has treated technology as a monolithic entity (Landers & Marin, Citation2021). Today, there is increasing awareness that developing technology and designing human-system work processes involve many design choices that require understanding both the human and the technological side. Utilizing the principles of STS theory entails a proactive approach in which the potential outcomes of different work design choices, especially those relating to the implementation of emerging technologies, are thoroughly evaluated beforehand. Additionally, STS theory stresses the importance of engaging employees in the underlying decision-making processes (Berkers et al., Citation2023; Parker & Grote, Citation2022; Richter et al., Citation2018). Simultaneously, work design can be constrained by factors such as technology characteristics or organizational constraints, as noted by Parker and colleagues (Parker, Van Den Broeck, et al., Citation2017). These limitations can subsequently influence the outcomes and repercussions stemming from the decisions made and actions performed by different stakeholders during the technology implementation process (Berkers et al., Citation2023; Strohmeier, Citation2009).

Fostering this understanding is not yet an integral part of many WOP courses. This leaves students with a slightly more nuanced understanding of the human side but a rough (and often still monolithic) understanding of technology. This lack of "the full picture" can lead to challenges when researching human-technology interaction, as students or researchers may be unaware of the diversity of technology. We need to educate future WOP researchers and practitioners to take a more active role in this domain. This means not only providing them with a better understanding of emerging technologies but also helping them develop strategies and skills to make related psychosocial aspects more visible to other stakeholders. We will address the latter issue in more detail in the next section.

Furthermore, we need to create opportunities for junior researchers that make it easier for them to build their careers around such interdisciplinary topics. Current gatekeeping practices are preventing WOP from getting involved in the development and implementation of workplace technologies. Third-party funding poses similar challenges: you can either frame an interdisciplinary research project in a way that sounds "monodisciplinary", playing along with the rules and adhering to the culture of your discipline, or you can apply to the limited funding outlets (e.g., Horizon Europe) that specifically call for interdisciplinary research. The consequence is that, with challenges in both publishing and securing third-party funding, we are also preventing people who do interdisciplinary research from getting into positions that could change the system, as they will be less likely to make an academic career.

Challenges in getting involved in the development and implementation of emerging technologies

Although WOP research already offers essential disciplinary knowledge and related skills for developing and implementing emerging technologies at work, WOP practitioners also require a better standing in such processes and should claim their rightful role. Current research highlights this gap between potential contribution and actual involvement. Based on the results of a study among logistics warehouse workers, Berkers and colleagues (Berkers et al., Citation2023) highlight the importance of involving HR in such processes and report: "Remarkable was the absence of HR as an organizational stakeholder in the decision-making and implementation processes across all warehouses" (p. 1863), and "The body of knowledge on what constitutes motivating, meaningful, and not (too) strenuous work (cf. Parker, Morgeson, et al., Citation2017), unfortunately, was hardly used in decisions about implementation or work design." (p. 1867). Similarly, in a study among production workers, "HR professionals were not involved and, therefore, miss out on a crucial opportunity to be of an added value" (Wolffgramm et al., Citation2021, p. 101). Current technology development and implementation practices regularly involve an overemphasis on the technology's potential for improving performance (Wolffgramm et al., Citation2021) rather than highlighting the role of the employees. Although HR is just one role in which WOP practitioners are stakeholders of technologization processes, and WOP may be involved in many other roles (e.g., engineering psychologists), these findings highlight an important problem: the role of HR in technology implementation processes is relatively small even though the impact of change on employees' jobs can be large. Yet, HR is typically the first line of contact when problems arise (for both employees and management). At the same time, this puts HR in a position of tension, having to deal with both people-centred and business-centred interests (Keegan et al., Citation2019), which may significantly diverge in the context of technology implementation (Langer & König, Citation2023). Despite the specialization of other WOP professionals in human-technology interaction, not all organizations can afford or have the opportunity to work with, for instance, ergonomics scientists who could contribute to the implementation process from a human-centric perspective. At present, perspectives from HR stakeholders are frequently underappreciated, while perspectives from other (potential) WOP stakeholders are not always included, as developers and organizations implementing emerging technologies might not fully recognize the critical role of the human factor in achieving successful technology implementation and use.

What is hindering WOP from becoming more involved? In conclusion, WOP researchers and practitioners face various challenges in their work in this interdisciplinary field. The panellists agree that WOP can already substantially contribute to research and practice in the domain of emerging technologies. However, we need to get more involved now to avoid getting left out in theoretical debates as well as organizational practices around technology and (work)system design, implementation, and policy making.

How can we ensure that WOP perspectives are heard and of value to the design of emerging technologies?

Throughout the discussion, the panellists raised diverse challenges and needs that WOP should address in the future. Scholars in our field have previously emphasized the lack of theory and research in the domain of technology at work and the need for different approaches (Landers & Marin, Citation2021). At the same time, ongoing technological, and especially AI, development has raised the sense of urgency for WOP to get more involved. To move forward and to encourage the involvement of WOP in the development and implementation of emerging technologies at work, we propose seven actionable recommendations for our field.

Recommendation 1:

Adopting an interdisciplinary mindset

With topics surrounding emerging technologies (e.g., AI) on the rise, relevant publications are emerging in many different research disciplines. While it is impractical to consider all of these disciplines, occasionally delving into literature from disciplines outside one's own can be enlightening. We thus suggest that WOP researchers, at least occasionally, read up on how similar research questions are addressed in other disciplines (e.g., "How does computer science discuss the concept of trust?") and integrate some of these insights into their own research (see, e.g., Ulfert et al., Citation2023, for an example of an interdisciplinary discussion of trust). Researchers seeking a thorough overview of technology in the workplace must be aware of relevant literature spanning diverse fields. These include, but are not limited to (with an example of an important outlet of the respective area of research in brackets): human factors and ergonomics (e.g., Human Factors), information systems research (e.g., Management Information Systems Quarterly), management (e.g., Academy of Management Review), work and organizational psychology (e.g., Journal of Applied Psychology, European Journal of Work and Organizational Psychology), human-computer interaction (e.g., Proceedings of the CHI Conference on Human Factors in Computing Systems), artificial intelligence (e.g., Artificial Intelligence), safety science (e.g., Safety Science), sociology (e.g., Big Data and Society), philosophy (e.g., Synthese), law (which has a quite different way of conducting research, so we cannot point to a single important outlet there), journals that cross disciplinary boundaries (e.g., Technology in Society; Human Relations), as well as diverse application fields, such as medicine, production, or transportation. This list of research fields and journals is neither exhaustive nor sufficiently detailed, given the diversity within these research areas.

Further, there is a strong need for integrative conceptual review papers designed to synthesize relevant literature, highlight potential synergies between disconnected lines of research, extend theoretical development, and propose new directions for future research. Journals such as the Journal of Applied Psychology, Psychological Science in the Public Interest, the Journal of Management review issue, and Academy of Management Annals already welcome this type of paper. These measures may help to build bridges between disciplines, as they highlight various disciplinary perspectives on themes related to emerging technologies in the workplace, and may consequently help to develop a common language between disciplines.

We recommend that WOP researchers reach out to colleagues from other disciplines within their own university and network who perform research on topics related to emerging technologies in the workplace, for example, from design, human factors/ergonomics, philosophy, or organization science perspectives. Second, we recommend that WOP researchers also think about their research's "design implications": how could insights translate into actual technology design or socio-technical system (i.e., implementation) design choices? Lastly, in moving towards interdisciplinary work in WOP, researchers and journal editors play an important role. We thus recommend that editorial teams further promote the submission of manuscripts that examine issues at the intersection of the involved disciplines and that are elaborated by multidisciplinary research teams. This includes the study of interdisciplinary topics and the use of interdisciplinary methods.

Recommendation 2:

Integration of technology-oriented and design topics in psychology curricula

As technology development is happening at increasing speed, gaining technology-related knowledge and skills is becoming a pressing issue within psychology curricula. Thus, we suggest that universities implement technology-related as well as interdisciplinary courses that can foster students' skills for collaboration across disciplines. For instance, project courses can be introduced into curricula in which psychology and computer science students develop systems together, e.g., through Challenge-Based Learning (Johnson et al., Citation2009). Theory courses can provide further room for discussion between psychology and computer science students, for instance, about classical psychological theories and their implications for system design and implementation. Further, courses should be implemented that cover the large body of human factors and ergonomics literature.

Psychology curricula should integrate WOP and technology topics earlier in the curriculum, when students decide on their future occupational direction, thereby embracing the value of psychology students developing interdisciplinary expertise. Due to the broadness of the field, graduating psychology students already find work opportunities in diverse industries. Gaining even a little experience in multidisciplinary work and technical topics will allow for even more career opportunities in diverse fields, such as user experience and design, information security, data science, ethical advisory boards, technology audits (e.g., TÜV), and many more.

Recommendation 3:

Creating opportunities for academic careers focused on interdisciplinary research in WOP

Interdisciplinary work comes with a diverse set of challenges in collaborating with other researchers, conducting studies, and publishing such work. We believe that interdisciplinary work and the efforts associated with it should be incentivized, or at least made easier. This can mean valuing design implications, valuing more "practice-focused" research without losing sight of generalizable insights, or valuing publications in outlets from other disciplines. This is particularly relevant for young scholars, whose career opportunities may be limited if journal editorial boards, reviewers of third-party funding, and university selection committees do not consider multidisciplinary work as valuable or "high-ranking" enough.

Recommendation 4:

Highlight the value of WOP perspectives in technologization processes

Stakeholders (e.g., engineers) need to be made more aware of WOP's central role and unique contribution to technologization processes. In engineering, we can estimate the consequences of implementing a robot in manufacturing, for instance with respect to plant productivity; such estimates are much more challenging for psychological outcomes. This also includes rethinking the role of WOP and moving away from making the human fit the technology towards establishing a human-technology-task fit that considers both sides (fitting the human to the technology and fitting the technology to the human). For example, WOP researchers could teach company engineers about the role of work (re)design in enhancing the outcomes of advanced technology implementation. Moreover, WOP professionals could help manage this transition by giving relevant strategic input to higher management about the importance of work design and by advocating for employees and their involvement (Berkers et al., Citation2023). Not only in practice but also in research, WOP perspectives should play a central role in the development of new socio-technical systems. As AI is transforming employees' jobs, these perspectives will be essential for designing effective human-AI collaboration that also fosters employee wellbeing.

Recommendation 5:

Becoming normative

As WOP is an applied science and, as such, is called upon to find solutions to problems based on how "things ought to be" (Simon, Citation1996), we suggest that WOP should "go back" to being more normative. Much of the early work in WOP, for example in selection, followed this "engineering" approach to technology design and implementation. WOP has also been involved in "human factors engineering", adding social considerations to technical solutions, both content-wise, for example by means of work design criteria and criteria for human-centred automation, and regarding processes, for instance through emphasizing user participation (Clegg, Citation2000; Mumford, Citation1983; Symon & Clegg, Citation2005). Research in this domain often followed the action research paradigm, which later became criticized as insufficiently "scientific" (Avison et al., Citation2001; Baskerville & Wood-Harper, Citation2016). Other disciplines, especially engineering, already take a normative stance (e.g., working with ISO norms). Moreover, these disciplines are looking for a more normative stance from WOP in technology development and implementation (e.g., providing concrete indicators of user reactions). Therefore, we need to return to the abovementioned research traditions and revitalize them with the knowledge garnered in the meantime through more explanatory and theory-driven research. By being more normative, WOP can become more relevant in a range of ongoing discussions around technology and the future of work more broadly. WOP knowledge can and should inform not only individual technology projects but also policy decisions, such as the current efforts to regulate the use of AI in language-based tools such as chatbots. Being more normative also entails being willing to take a stance and accept the responsibility of guiding practice. To build this attitude into WOP research and practice, education is crucial.

Recommendation 6:

Making the involvement of multiple stakeholder groups, including WOP, a standard practice

The importance of organizational change management consistently emerges as a critical factor for the successful adoption of emerging technologies. This includes the involvement of multiple stakeholder groups in the change process. Similarly, the AI governance literature suggests that the integration of multiple stakeholders in AI design processes should become a norm (Chhillar & Aguilera, Citation2022). Embracing human-centred approaches and considering STS, action regulation theory, and work design principles can effectively highlight and address the needs and expectations of all relevant stakeholders before, during, and after implementation. Yet, at present, technocentric implementation processes are often still the norm. Growing awareness among higher management of the contributions that WOP can make to the successful implementation and adoption of emerging technologies may grant WOP practitioners better access to all relevant stakeholders. Hence, it is imperative that we promote the integration of WOP perspectives as a standard practice in both the design and implementation of technology more vigorously.

Recommendation 7:

Contributing to policy-making

We believe that the influence of WOP should not be restricted to research and practice. Social perspectives should especially be considered when discussing the future of technology in society. Thus, we believe WOP should get involved in policy-making on AI and technology more generally, collaborate with labour economists to bridge micro and macro approaches to shaping the future of work, and build institutional bridges between EAWOP and associations in labour economics, human factors engineering, information systems research, and so forth. Technological innovation and its implementation should be driven by societal needs. WOP can help to shift the focus of current political discussions towards employees, which is urgently needed.

We have discussed the diverse roles that WOP can and should play in the development, implementation, and use of emerging technologies at work. Although getting involved can be challenging, we aim to encourage scholars and practitioners to proactively get involved in this field and share their unique expertise to make the future of emerging technologies employee-centric rather than technocentric.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

The participation of Vicente González-Romá was supported by research grant Prometeo2021/048 from Generalitat Valenciana. In accordance with Taylor & Francis policy and our ethical obligation as researchers, we report that we do not have any financial or business interests in, and are not receiving funding from, a company that may be affected by the research reported in the enclosed paper.

References

  • Anthony, C., Bechky, B. A., & Fayard, A.-L. (2023). “Collaborating” with AI: Taking a system view to explore the future of work. Organization Science, 34(5), 1672–1694. https://doi.org/10.1287/orsc.2022.1651
  • Avison, D., Baskerville, R., & Myers, M. (2001). Controlling action research projects. Information Technology & People, 14(1), 28–45. https://doi.org/10.1108/09593840110384762
  • Bainbridge, L. (1983). Ironies of automation. In G. Johannsen & J. E. Rijnsdorp (Eds.), Analysis, design and evaluation of man–machine systems (pp. 129–135). Pergamon. https://doi.org/10.1016/B978-0-08-029348-6.50026-9
  • Bankins, S., Ocampo, A. C., Marrone, M., Restubog, S. L. D., & Woo, S. E. (2023). A multilevel review of artificial intelligence in organizations: Implications for organizational behavior research and practice. Journal of Organizational Behavior, 45(2), 159–182. https://doi.org/10.1002/job.2735
  • Baskerville, R. L., & Wood-Harper, A. T. (2016). A critical perspective on action research as a method for information systems research. In L. P. Willcocks, C. Sauer, & M. C. Lacity (Eds.), Enacting research methods in information systems: Volume 2 (pp. 169–190). Springer International Publishing. https://doi.org/10.1007/978-3-319-29269-4_7
  • Beer, P., & Mulder, R. H. (2020). The effects of technological developments on work and their implications for continuous vocational education and training: A systematic review. Frontiers in Psychology, 11, 918. https://doi.org/10.3389/fpsyg.2020.00918
  • Berg, J. M., Raj, M., & Seamans, R. (2023). Capturing value from artificial intelligence. Academy of Management Discoveries, 9(4), 424–428. https://doi.org/10.5465/amd.2023.0106
  • Berkers, H. A., Rispens, S., & Le Blanc, P. M. (2023). The role of robotization in work design: A comparative case study among logistic warehouses. The International Journal of Human Resource Management, 34(9), 1852–1875. https://doi.org/10.1080/09585192.2022.2043925
  • Burrell, J. (2016). How the machine ‘thinks’: Understanding opacity in machine learning algorithms. Big Data & Society, 3(1), 2053951715622512. https://doi.org/10.1177/2053951715622512
  • Chhillar, D., & Aguilera, R. V. (2022). An eye for artificial intelligence: Insights into the governance of artificial intelligence and vision for future research. Business & Society, 61(5), 1197–1241. https://doi.org/10.1177/00076503221080959
  • Clegg, C. W. (2000). Sociotechnical principles for system design. Applied Ergonomics, 31(5), 463–477. https://doi.org/10.1016/S0003-6870(00)00009-0
  • Colquitt, J. A., Greenberg, J., & Zapata-Phelan, C. P. (2005). What is organizational justice? A historical overview. In J. Greenberg & J. A. Colquitt (Eds.), Handbook of organizational justice. Lawrence Erlbaum Associates Publishers.
  • de Winter, J. C. F., & Dodou, D. (2014). Why the Fitts list has persisted throughout the history of function allocation. Cognition, Technology & Work, 16(1), 1–11. https://doi.org/10.1007/s10111-011-0188-1
  • Emery, F. E. (1959). Characteristics of socio-technical systems: A critical review of theories and facts about the effects of technological change on the internal structure of work organisations; with special reference to the effects of higher mechanisation and automation. Tavistock Institute of Human Relations, Human Resources Centre.
  • Endsley, M. R. (1995). Toward a theory of situation awareness in dynamic systems. Human Factors: The Journal of the Human Factors & Ergonomics Society, 37(1), 32–64. https://doi.org/10.1518/001872095779049543
  • Endsley, M. R. (2017). From here to autonomy: Lessons learned from human–automation research. Human Factors: The Journal of the Human Factors & Ergonomics Society, 59(1), 5–27. https://doi.org/10.1177/0018720816681350
  • Endsley, M. R. (2023). Supporting human-AI teams: Transparency, explainability, and situation awareness. Computers in Human Behavior, 140, 107574. https://doi.org/10.1016/j.chb.2022.107574
  • Fitts, P. M. (1951). Human engineering for an effective air-navigation and traffic-control system. National Research Council.
  • Gasteiger, N., Hellou, M., & Ahn, H. S. (2023). Factors for personalization and localization to optimize Human–robot interaction: A literature review. International Journal of Social Robotics, 15(4), 689–701. https://doi.org/10.1007/s12369-021-00811-8
  • Grote, G. (2023). Shaping the development and use of artificial intelligence: How human factors and ergonomics expertise can become more pertinent. Ergonomics, 66(11), 1702–1710. https://doi.org/10.1080/00140139.2023.2278408
  • Grote, G., Ryser, C., Wäfler, T., Windischer, A., & Weik, S. (2000). KOMPASS: A method for complementary function allocation in automated work systems. International Journal of Human-Computer Studies, 52(2), 267–287. https://doi.org/10.1006/ijhc.1999.0289
  • Hacker, W. (2003). Action regulation theory: A practical tool for the design of modern work processes? European Journal of Work and Organizational Psychology, 12(2), 105–130. https://doi.org/10.1080/13594320344000075
  • Hacker, W. (2022). Arbeitsgestaltung bei Digitalisierung: Merkmale menschzentrierter Gestaltung informationsverarbeitender Erwerbsarbeit. Zeitschrift Für Arbeitswissenschaft, 76(1), 90–98. https://doi.org/10.1007/s41449-022-00302-0
  • Hancock, P. A., & Scallen, S. F. (1996). The future of function allocation. Ergonomics in Design: The Quarterly of Human Factors Applications, 4(4), 24–29. https://doi.org/10.1177/106480469600400406
  • Hoff, K. A., & Bashir, M. (2015). Trust in automation: Integrating empirical evidence on factors that influence trust. Human Factors: The Journal of the Human Factors & Ergonomics Society, 57(3), 407–434. https://doi.org/10.1177/0018720814547570
  • Holman, D. (2013). Job types and job quality in Europe. Human Relations, 66(4), 475–502. https://doi.org/10.1177/0018726712456407
  • Humphrey, S. E., Nahrgang, J. D., & Morgeson, F. P. (2007). Integrating motivational, social, and contextual work design features: A meta-analytic summary and theoretical extension of the work design literature. The Journal of Applied Psychology, 92(5), 1332–1356. https://doi.org/10.1037/0021-9010.92.5.1332
  • Johnson, L. F., Smith, R. S., Smythe, J. T., & Varon, R. K. (2009). Challenge-based learning: An approach for our time. The New Media Consortium. https://www.learntechlib.org/p/182083/
  • Kaber, D. B., & Endsley, M. R. (2004). The effects of level of automation and adaptive automation on human performance, situation awareness and workload in a dynamic control task. Theoretical Issues in Ergonomics Science, 5(2), 113–153. https://doi.org/10.1080/1463922021000054335
  • Keegan, A., Brandl, J., & Aust, I. (2019). Handling tensions in human resource management: Insights from paradox theory. German Journal of Human Resource Management: Zeitschrift für Personalforschung, 33(2), 79–95. https://doi.org/10.1177/2397002218810312
  • Landers, R. N., & Marin, S. (2021). Theory and technology in organizational psychology: A review of technology integration paradigms and their effects on the validity of theory. Annual Review of Organizational Psychology and Organizational Behavior, 8(1), 235–258. https://doi.org/10.1146/annurev-orgpsych-012420-060843
  • Langer, M., & König, C. J. (2021). Introducing a multi-stakeholder perspective on opacity, transparency and strategies to reduce opacity in algorithm-based human resource management. Human Resource Management Review, 33(1), 100881. https://doi.org/10.1016/j.hrmr.2021.100881
  • Langer, M., & König, C. J. (2023). Introducing a multi-stakeholder perspective on opacity, transparency and strategies to reduce opacity in algorithm-based human resource management. Human Resource Management Review, 33(1), 100881. https://doi.org/10.1016/j.hrmr.2021.100881
  • Langer, M., König, C. J., & Busch, V. (2021). Changing the means of managerial work: Effects of automated decision support systems on personnel selection tasks. Journal of Business and Psychology, 36(5), 751–769. https://doi.org/10.1007/s10869-020-09711-6
  • Langer, M., & Landers, R. N. (2021). The future of artificial intelligence at work: A review on effects of decision automation and augmentation on workers targeted by algorithms and third-party observers. Computers in Human Behavior, 123, 106878. https://doi.org/10.1016/j.chb.2021.106878
  • Langer, M., Oster, D., Speith, T., Hermanns, H., Kästner, L., Schmidt, E., Sesing, A., & Baum, K. (2021). What do we want from explainable artificial intelligence (XAI)? – a stakeholder perspective on XAI and a conceptual model guiding interdisciplinary XAI research. Artificial Intelligence, 296, 103473. https://doi.org/10.1016/j.artint.2021.103473
  • Larson, L., & DeChurch, L. (2020). Leading teams in the digital age: Four perspectives on technology and what they mean for leading teams. The Leadership Quarterly, 31(1), 1–18. https://doi.org/10.1016/j.leaqua.2019.101377
  • Lee, J. D., & See, K. A. (2004). Trust in automation: Designing for appropriate reliance. Human Factors: The Journal of the Human Factors & Ergonomics Society, 46(1), 50–80. https://doi.org/10.1518/hfes.46.1.50.30392
  • Makarius, E. E., Mukherjee, D., Fox, J. D., & Fox, A. K. (2020). Rising with the machines: A sociotechnical framework for bringing artificial intelligence into the organization. Journal of Business Research, 120, 262–273. https://doi.org/10.1016/j.jbusres.2020.07.045
  • Matza, L. S., Stewart, K. D., Lloyd, A. J., Rowen, D., & Brazier, J. E. (2021). Vignette-based utilities: Usefulness, limitations, and methodological recommendations. Value in Health, 24(6), 812–821. https://doi.org/10.1016/j.jval.2020.12.017
  • McBride, S. E., Rogers, W. A., & Fisk, A. D. (2014). Understanding human management of automation errors. Theoretical Issues in Ergonomics Science, 15(6), 545–577. https://doi.org/10.1080/1463922X.2013.817625
  • Mitchell, S., Potash, E., Barocas, S., D’Amour, A., & Lum, K. (2021). Algorithmic fairness: Choices, assumptions, and definitions. Annual Review of Statistics and Its Application, 8(1), 141–163. https://doi.org/10.1146/annurev-statistics-042720-125902
  • Morgeson, F. P., & Campion, M. A. (2003). Work design. In I. B. Weiner (Ed.), Handbook of psychology (1st ed., pp. 423–452). Wiley. https://doi.org/10.1002/0471264385.wei1217
  • Mosier, K. L., Skitka, L. J., Burdick, M. D., & Heers, S. T. (1996). Automation bias, accountability, and verification behaviors. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 40(4), 204–208. https://doi.org/10.1177/154193129604000413
  • Mumford, E. (1983). Participative systems design: Practice and theory. Journal of Occupational Behaviour, 4(1), 47–57.
  • Nelson, A. J., & Irwin, J. (2014). ‘Defining what we do – all over again’: Occupational identity, technological change, and the Librarian/Internet-search relationship. The Academy of Management Journal, 57(3), 892–928. https://doi.org/10.5465/amj.2012.0201
  • Norcross, J. C., Hailstorks, R., Aiken, L. S., Pfund, R. A., Stamm, K. E., & Christidis, P. (2016). Undergraduate study in psychology: Curriculum and assessment. American Psychologist, 71(2), 89–101. https://doi.org/10.1037/a0040095
  • Parasuraman, R., & Manzey, D. H. (2010). Complacency and bias in human use of automation: An attentional integration. Human Factors: The Journal of the Human Factors & Ergonomics Society, 52(3), 381–410. https://doi.org/10.1177/0018720810376055
  • Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30(3), 286–297. https://doi.org/10.1109/3468.844354
  • Parker, S. K., Andrei, D. M., & Van den Broeck, A. (2019). Poor work design begets poor work design: Capacity and willingness antecedents of individual work design behavior. The Journal of Applied Psychology, 104(7), 907–928. https://doi.org/10.1037/apl0000383
  • Parker, S. K., & Grote, G. (2022). Automation, algorithms, and beyond: Why work design matters more than ever in a digital world. Applied Psychology, 71(4), 1171–1204. https://doi.org/10.1111/apps.12241
  • Parker, S. K., Morgeson, F. P., & Johns, G. (2017). One hundred years of work design research: Looking back and looking forward. The Journal of Applied Psychology, 102(3), 403–420. https://doi.org/10.1037/apl0000106
  • Parker, S. K., Van Den Broeck, A., & Holman, D. (2017). Work design influences: A synthesis of multilevel factors that affect the design of jobs. Academy of Management Annals, 11(1), 267–308. https://doi.org/10.5465/annals.2014.0054
  • Phillips, E., Ososky, S., Grove, J., & Jentsch, F. (2011). From tools to teammates: Toward the development of appropriate mental models for intelligent robots. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 55(1), 1491–1495. https://doi.org/10.1177/1071181311551310
  • Reverberi, C., Rigon, T., Solari, A., Hassan, C., Cherubini, P., & Cherubini, A. (2022). Experimental evidence of effective human–AI collaboration in medical decision-making. Scientific Reports, 12(1), 14952. https://doi.org/10.1038/s41598-022-18751-2
  • Richter, A., Heinrich, P., Stocker, A., & Schwabe, G. (2018). Digital work design: The interplay of human and computer in future work practices as an interdisciplinary (grand) challenge. Business & Information Systems Engineering, 60(3), 259–264. https://doi.org/10.1007/s12599-018-0534-4
  • Seeber, I., Waizenegger, L., Seidel, S., Morana, S., Benbasat, I., & Lowry, P. B. (2020). Collaborating with technology-based autonomous agents: Issues and research opportunities. Internet Research, 30(1), 1–18. https://doi.org/10.1108/INTR-12-2019-0503
  • Simon, H. A. (1996). The sciences of the artificial (3rd ed.). MIT Press.
  • Strohmeier, S. (2009). Concepts of e-HRM consequences: A categorisation, review and suggestion. The International Journal of Human Resource Management, 20(3), 528–543. https://doi.org/10.1080/09585190802707292
  • Symon, G., & Clegg, C. (2005). Constructing identity and participation during technological change. Human Relations, 58(9), 1141–1166. https://doi.org/10.1177/0018726705058941
  • Tenenbaum, L. M. (2022, December 1). Introducing high school students to human factors engineering. American Psychological Association. https://www.apa.org/ed/precollege/psychology-teacher-network/introductory-psychology/human-factors-engineering
  • Trist, E. L. (1981). The evolution of socio-technical systems: A conceptual framework and an action research program. Ontario Ministry of Labour, Ontario Quality of Working Life Centre.
  • Trist, E. L., & Bamforth, K. W. (1951). Some social and psychological consequences of the longwall method of coal-getting: An examination of the psychological situation and defences of a work group in relation to the social structure and technological content of the work system. Human Relations, 4(1), 3–38. https://doi.org/10.1177/001872675100400101
  • Tyre, M. J., & Orlikowski, W. J. (1994). Windows of opportunity: Temporal patterns of technological adaptation in organizations. Organization Science, 5(1), 98–118. https://doi.org/10.1287/orsc.5.1.98
  • Ulfert, A.-S., Antoni, C. H., & Ellwart, T. (2022). The role of agent autonomy in using decision support systems at work. Computers in Human Behavior, 126, 106987. https://doi.org/10.1016/j.chb.2021.106987
  • Ulfert, A.-S., Georganta, E., Centeio Jorge, C., Mehrotra, S., & Tielman, M. L. (2023). Shaping a multidisciplinary understanding of team trust in human-AI teams: A theoretical framework. European Journal of Work and Organizational Psychology, 1–14. https://doi.org/10.1080/1359432X.2023.2200172
  • van Zoelen, E. M., van den Bosch, K., & Neerincx, M. (2021). Becoming team members: Identifying interaction patterns of mutual adaptation for human-robot co-learning. Frontiers in Robotics and AI, 8, 692811. https://doi.org/10.3389/frobt.2021.692811
  • Wächter, H., Modrow-Thiel, B., & Roßmann, G. (1989). Prospektive Arbeitsgestaltung – Das Verfahren ATAA [Prospective work design – The ATAA method]. German Journal of Human Resource Management: Zeitschrift für Personalforschung, 3(4), 277–296. https://doi.org/10.1177/239700228900300402
  • Waterson, P. E., Older Gray, M. T., & Clegg, C. W. (2002). A sociotechnical method for designing work systems. Human Factors: The Journal of the Human Factors & Ergonomics Society, 44(3), 376–391. https://doi.org/10.1518/0018720024497628
  • Wolffgramm, M., Tijink, T., Disberg-van Geloven, M., & Corporaal, S. (2021). A collaborative robot in the classroom: Designing 21st century engineering education together. Journal of Higher Education Theory and Practice, 21(16), 177–187.