Editorial preface article

The changing nature of teaching future IS professionals in the era of generative AI

Miranda Kajtazi, Nicklas Holmberg & Saonee Sarker

AI is not magic anymore. The development, use, and application of innovative digital technologies, such as generative artificial intelligence (GAI) and its specific subset of large language models (LLMs) that power products like ChatGPT and other chatbots, carry a disruptive and revolutionary potential with significant implications for individuals, businesses, and society (Dwivedi et al., Citation2023; Nah et al., Citation2023). These implications concern not only the way we live and work, but also the way we educate our future generations. In the era of technologies like ChatGPT, the question arises: how do we teach with such tools, and what is the difference, and the potential implication of that difference, between an answer and the right answer? Given this context, the spotlight has frequently been on teaching students how to foster resilience (Stallman, Citation2009). Yet curriculum guidelines have been overlooked in their ability to stand the test of time.

In this editorial, we emphasize the need to shift our focus toward vocalizing the case for regularly updated Information Systems (IS) curriculum guidelines (i.e., IS 2020 by Leidig and Salmela (Citation2020)) as a pivotal aspect in understanding IS education, its evolution, and its prospects in this new technological landscape. We also ask about a potential devolution: will higher IS education go back to pen and paper when chatbots enter the classroom? This question arises from the gaps and uncertainties that surround the successful and responsible integration of, e.g., LLMs into learning and teaching processes (Kasneci et al., Citation2023). From our perspective, regularly updated IS curriculum guidelines embody a dynamic quality that consistently guarantees the relevance of an IS curriculum in the face of the continually shifting landscape of industrial demands, trends, and technology-driven digital transformations.

The arrival of ChatGPT is already driving significant transformations through automation and digital immersion across all facets of life, including the domain of education (Dwivedi et al., Citation2023). These changes hold the potential for profound implications for future generations, pushing us to envision the true essence of an IS education and its curriculum guidelines.

The need for a historical perspective

To gain a more profound understanding of this subject, it is imperative to delve into the historical roots of the IS curriculum, tracing them back to the 1970s (Leidig & Salmela, Citation2020). This historical perspective helps in understanding not only the foundations of IS education and the semi-abstract artifact at its core, i.e., the IS itself, but also the significance of future IS education. It allows us to place particular emphasis on undergraduates, the newcomers who will play a key role in the future application of IS in academia and industry alike.

The Association for Information Systems and the Association for Computing Machinery (AIS/ACM) task force has played a crucial role in shaping IS curriculum guidelines over the decades, serving as a cornerstone for the development and maintenance of local IS education programs worldwide. In retrospect, it becomes apparent that the current IS 2020 curriculum guidelines mark a new beginning for the future of IS education, as detailed by Leidig and Salmela (Citation2020), and that the IS 2010 curriculum guidelines, as detailed by Topi et al. (Citation2010), marked a notable departure from their predecessor, the IS 2002 curriculum guidelines. A more detailed historical overview is penned by Leidig and Salmela (Citation2020, p. 7). Iterations of IS curriculum guidelines have thus been adopted and applied across universities worldwide since the early 1970s.

The most recent iteration, IS 2020, represents a significant leap forward in comprehending the dynamism of IS within the ever-evolving technological landscape that impacts individuals, businesses, and society. The IS 2020 curriculum guidelines (Leidig & Salmela, Citation2020), which pinpoint the multi-disciplinarity of the IS field and the constant stream of technological developments, deployment opportunities, and trends, challenge the “IS faculty to design a curriculum that adequately addresses the needs of future generations of IS professionals” (Leidig & Salmela, Citation2020, p. 24). While IS 2020 clearly differs from IS 2010 in foreseeing a future of the IS curriculum with challenges lying ahead, its discussion of “what to learn” raises concerns in today’s technological environment. IS 2020 largely overlooks “how to learn it,” even though it emphasizes competencies and the rapidly changing ways of learning, with online learning (e.g., MOOCs) taking the focus.

It is safe to say that since the launch of IS 2020, the technological space has evolved profoundly in just three years. AI is not magic anymore, technology is developing at an astounding rate, and the urge to explore the next frontier is inborn in humanity, leading to digital immersion. While we still view digital technologies as capable of driving innovation in ways previously unimagined, we are now also leaning toward viewing them as technologies that pose significant challenges for humans in all spheres of life, not least education.

The definition of competence as “[…] the graduate’s ability to apply knowledge, skills, and dispositions […]” (Leidig & Salmela, Citation2020, p. 35) is put under pressure in a technological landscape where, e.g., ChatGPT affects that definition to a great extent and raises ethical concerns. Firat (Citation2023) provides a detailed review of how others have already studied the pros and cons of using ChatGPT in education, with contradictory views ranging from its role in “personalized learning” to its role in “learning with misinformation, bias, and privacy issues”. Dwivedi et al. (Citation2023) also cover these matters comprehensively. Firat (Citation2023) suggests that one way to address these contradictions is to prioritize curricula and pedagogical approaches that better address the capabilities of AI.

If we are to teach students how to foster resilience, and if curricula are to stand the test of time and become more resilient themselves, curriculum design must develop and implement incentives for responsible use of GAI in higher IS education.

Challenges ahead

We prompted ChatGPT with a question that might intrigue our IS community: “How often should educational curriculum guidelines be updated?” While the answer is somewhat lengthy, the very first of the ten guidelines it offered suggests the following: “Industry and Technological Changes: If the field is subject to rapid changes, such as technology or healthcare, guidelines may need updates every 2–5 years or even more frequently to reflect the latest knowledge and industry practices.” Typically, and since the dot-com bust, updates to IS curriculum guidelines have occurred with a significant lag (Kajtazi & Holmberg, Citation2019), often spanning a decade (e.g., IS 2010 to IS 2020) and, in the best-case scenario, around eight years (e.g., IS 2002 to IS 2010). We are optimistic that this interval will undergo a substantial reduction, potentially halving or better, with an updated IS curriculum guideline introduced within the next few years.
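As a side note, readers who wish to repeat this exercise programmatically rather than through the chat interface could do so in a few lines of Python. The sketch below is our own illustration using the OpenAI Python client; the model name and parameters are assumptions for illustration only, not what produced the quote above (which came from the standard ChatGPT web interface).

```python
# Minimal sketch (our illustration): posing the same question to a chat model
# via the OpenAI Python client. Model name and temperature are assumed values.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY to be set in the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # hypothetical choice; any chat-capable model works
    messages=[
        {
            "role": "user",
            "content": "How often should educational curriculum guidelines be updated?",
        }
    ],
    temperature=0.7,
)

# Print the model's answer, analogous to reading the reply in the chat window
print(response.choices[0].message.content)
```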

While ChatGPT could not provide references for the ten guidelines it presented (including the first one quoted above), a straightforward Google search on that statement yields several potential references, including the recent and prominently discussed article by Dwivedi et al. (Citation2023), which appears as the fourth listed link, from the ScienceDirect database. As IS educators, we approach this answer with scrutiny, recognizing its gravity from an ethical standpoint. Still, the recent advancements embodied by ChatGPT urge us to consider that if our community waits for an IS 2030 curriculum guideline, as the current trajectory of expecting new IS curriculum guidelines suggests, the delay may be too long for enhancing “what” and “how” future generations learn. This concern is well documented in Dwivedi et al. (Citation2023, p. 26), where data quality bias, interpreting and understanding the model’s output, privacy and security, limited explanation capability, human-computer interaction, and ethical concerns are listed as key challenges of using GAI in education. Let us, then, ask whether the IS 2020 curriculum guidelines have enough resilience to capture the fast-evolving technological landscape. In our view, IS 2020 falls short in addressing the “how-to,” especially in light of the panic across academia over how to assess “how students learn” from now on.

Students in higher education increasingly rely on ChatGPT for their learning, making it challenging to assess precisely “how they learn.” This is why we considered it necessary to open this editorial with the question “will higher IS education go back to pen and paper when chatbots enter the classroom?” and to stress the significance of developing and implementing incentives for responsible use of GAI in higher IS education curricula.

Disclosing early concerns before the ChatGPT era

Teaching practices, including how teachers use various systems to check for, e.g., plagiarism, are already outdated. A recent study by Khalil and Er (Citation2023) showed that existing popular plagiarism detection systems, like iThenticate and Turnitin, can entirely overlook essays written with ChatGPT. As a result, institutions are now ramping up efforts to create better templates for “author contribution statements” for students, inspired by the statements and recently updated editorial policies already found on the websites of academic publishers such as Science or Elsevier. Such a contribution statement can be viewed as a push for responsible use of GAI.
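To illustrate why overlap-based checking struggles with generated text, consider the minimal sketch below; it is our own illustration with made-up example sentences, not the method of Khalil and Er (Citation2023) or of the commercial tools. A TF-IDF cosine comparison of an essay against a small source corpus stays low when the essay’s wording is freshly generated rather than copied, which is one intuition for why such checks can miss LLM-written essays.

```python
# Minimal sketch (our illustration): overlap-based similarity scoring of an
# essay against a source corpus. Freshly generated text shares little exact
# wording with any single source, so scores like these tend to stay low.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

sources = [
    "Information systems curricula should be revised regularly to track industry needs.",
    "Plagiarism detection compares submitted text against large databases of prior work.",
]
essay = "Curriculum guidelines in computing ought to evolve as tools and markets change."

vectorizer = TfidfVectorizer(ngram_range=(1, 3))      # word n-grams up to trigrams
matrix = vectorizer.fit_transform(sources + [essay])  # last row is the essay

essay_vec = matrix[len(sources)]                      # vector for the essay
source_vecs = matrix[: len(sources)]                  # vectors for the sources
scores = cosine_similarity(essay_vec, source_vecs)    # essay vs. each source

print(scores)  # low values: little verbatim overlap, so no "match" is flagged
```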

Previous results from an article of ours, written for the Information Systems Education Conference, known as an important forum for information systems educators and professionals, paired with the arguments above, bring us back to resilience. While our article (Kajtazi et al., Citation2020) predates the recent developments in GAI and LLMs by almost three years, already then our mapping study identified four crucial contingencies that directly affect institutional IS curricula, namely: (i) dangers of legacy; (ii) resource competence; (iii) technological availability; and (iv) trend sensitivity.

In short, the four contingencies represent the following:

  1. Dangers of Legacy: Pride in legacy becomes a danger because an outdated curriculum can introduce more abstraction than necessary into the classroom. For example, business-driven concepts that are no longer key phenomena in the IS field lose credibility in class, and overly technical courses lead students to suspect that they need a computer science background to be ready for an IS program.

  2. Resource Competence: Frequent changes in courses often lead to major revisions in which resource competence becomes a problem. A lecturer cannot develop knowledge of, e.g., a new technology at short notice, and will be far from developing the expertise needed to teach that technology to its full capacity.

  3. Technological Availability: With frequent revisions of IS courses and programs, up-to-date technological skills become mandatory for profiling students toward the right trends; however, IS programs often cannot keep up with such trends.

  4. Trend Sensitivity: Fast-moving markets show that industry can quickly move in a direction where our students no longer match its immediate competence needs.

When considering these four contingencies, it became apparent to us as early as 2018 that the IS curriculum guidelines were notably outdated and progressing at a slow pace. Consequently, we continued exploring these four contingencies, placing significant importance on the perspectives presented by Clark et al. (Citation2017) in a similar fashion for the IS 2010 curriculum, as we articulated the following: “The AIS/ACM task force has been criticized for keeping a modest pace in the IS undergraduate curriculum 2010 development (Clark et al., Citation2017). It has been almost a decade since the first design of the current approved IS curriculum guidelines was presented (starting in 2009) letting us adhere to guidelines not always applicable for designing a contemporary IS program”, and we continued to reason for the need for a much earlier IS curriculum guideline by stating: “Our aim is to inform the task force that there is an urge to bring the IS202× much earlier than predicted” (Kajtazi et al., Citation2020, p. 77). At the time we were writing that article (i.e., Kajtazi et al. (Citation2020)), it was not immediately clear whether the IS curriculum guidelines would indeed be published in 2020.

The need to speed up the modest pace of IS curriculum guideline revisions

IS has, since its inception, been a rapidly changing field. IS professionals and scholars have learned how to adapt, stay up to date, and innovate in the face of constant change (Clark et al., Citation2017). Yet we have not witnessed a change as rapid as the development and diffusion of GAI and LLMs before. Within one week, ChatGPT reached 1 million usersFootnote1; that number rose to 100 million users within a few months, urging academics around the world, and not least our IS community, to panic and re-think the way we teach and examine our students, a re-thinking that is now quickly turning into ad-hoc procedures. In a recent editorial in Science titled “ChatGPT is fun, but not an author,” Thorp (Citation2023) recognizes it as an “endless entertainment” tool that is too immature and unreliable to co-author with you; yet its implications for education are pushing academics to re-think their course examination methods in innovative ways, for instance by giving assignments that ChatGPT cannot easily solve.

Based on this, we foresee a clear decline in the use of essays as examinations. This is unfortunate and challenging, because future generations learn a great deal by writing essays, which help them develop their critical thinking skills; we cannot simply abolish essays because of, e.g., ChatGPT. Perhaps essay writing will soon return to a very traditional format, bringing us back to square one with plagiarism in sight? A traditional essay format often meant that students spent many hours in a classroom with no devices, and even no books, at hand. Such worries have a history: the introduction of calculators into the classroom half a century ago was considered a potential threat to the learning process (Holmes & Tuomi, Citation2022). From an ethical perspective, there is still a scarcity of studies offering a deeper understanding of how ChatGPT can lead to positive transformational changes, including the rise of positive teaching and learning. By ethically considering social justice, individual autonomy, cultural identity, and the environment, in the light of Stahl and Eke (Citation2024), the question of responsible use of GAI again becomes relevant. These four concerns are heavily debated in prominent scientific and philosophical communities, including by Nick Bostrom and Max Tegmark, who currently lead the discourse on how the rise of GAI poses an existential risk to humanity, not least from the perspective of these four crucial considerations that differentiate us from self-improving machines.

While we may not have the answers regarding how to effectively incorporate technologies like GAI and LLM tools in teaching, omnipresent as they are in the devices of both students and academics, we strongly advise our IS community to recognize its responsibility in accelerating the formulation of updated IS curriculum guidelines, overcoming the typical trajectory of one revision every ten years. These guidelines must comprehensively address not only the “what” aspect, defining what we should teach future generations, but also the “how” aspect, describing how we should educate them, including the responsible use of GAI, thereby helping the IS community bring forward new and innovative teaching methods. Perhaps responsible use of GAI will become more effective when new regulations, such as the European Union AI Act, come into force.

Disclosure statement

No potential conflict of interest was reported by the authors.

Correction Statement

This article has been corrected with minor changes. These changes do not impact the academic content of the article.

Additional information

Notes on contributors

Miranda Kajtazi

Miranda Kajtazi is an Associate Professor at the Informatics department at Lund University School of Economics and Management. Her research encompasses a range of topics, including information security, privacy, data-driven business models, and digital inequalities. Miranda’s scholarly work has been featured in Information & Computer Security, Journal of TripleC, and Transactions on Replication Research, among others. She is currently leading a research project on Human Rights and Digital Inequalities, and actively participates in various school boards, particularly dedicated to internationalization and equality efforts. Her involvement includes initiatives aimed at enhancing programme and curriculum development in the long run.

Nicklas Holmberg

Nicklas Holmberg, Ph.D. and Senior Lecturer, is the Head of the Department of Informatics at Lund University School of Economics and Management. Nicklas holds a Ph.D. in Information Systems and is currently a Senior Associate at MIT Research School, an eGovernment Member, an IBM GWC/WUG, Client Reference and Customer Partner Program Member, a Smarter Planet Reference, and a Microsoft Certified Technology Specialist (MCTS). Nicklas’ research focuses on Business Architecture, Business Decision Automation, and Business Process Service Orientation and is conducted in the health care and banking sectors.

Saonee Sarker

Saonee Sarker is the Richard E. Sorensen Dean of the Pamplin College of Business, Virginia Tech. She is also Visiting Professor at the London School of Economics and was previously at Lund University, Sweden, and the University of Virginia. Her research interests include smart infrastructure and sustainability, healthcare information technology and technostress, and technology-enabled collaboration, and her work has been published in MIS Quarterly, Information Systems Research, and Journal of Management Information Systems, among others. She is the director of diversity, equity, and inclusion (DEI) for MIS Quarterly and senior editor emeritus for the journal.

Notes

References

  • Clark, J., Clark, C., Gambill, S., & Brooks, S. (2017). IS curriculum models, course offerings, and other academic myths/hopes. Journal of Higher Education Theory and Practice, 17(9), 61–68.
  • Dwivedi, Y. K., Kshetri, N., Hughes, L., Slade, E. L., Jeyaraj, A., Kar, A. K., Baabdullah, A. M., Koohang, A., Raghavan, V., Ahuja, M., Albanna, H., Albashrawi, M. A., Al-Busaidi, A. S., Balakrishnan, J., Barlette, Y., Basu, S., Bose, I., Brooks, L. … Wirtz, J. (2023). Opinion paper: “so what if ChatGPT wrote it?” multidisciplinary perspectives on opportunities, challenges and implications of generative conversational AI for research, practice and policy. International Journal of Information Management, 71, 102642. https://doi.org/10.1016/j.ijinfomgt.2023.102642
  • Firat, M. (2023). What ChatGPT means for universities: Perceptions of scholars and students. Journal of Applied Learning & Teaching, 6(1). https://doi.org/10.37074/jalt.2023.6.1.22. Advance Online Publication.
  • Holmes, W., & Tuomi, I. (2022). State of the art and practice in AI in education. European Journal of Education, 57(4), 542–570. https://doi.org/10.1111/ejed.12533
  • Kajtazi, M., & Holmberg, N. (2019). IS education revisited: Reflections on a BSc program in business information systems design. International Conference on Information Management, Cambridge, UK. IEEE.
  • Kajtazi, M., Holmberg, N., & Sarker, S. (2020). Insights for next generation undergraduate IS curriculum developers. Information Systems Education Conference (ISECON), Plano, Texas.
  • Kasneci, E., Seßler, K., Küchemann, S., Bannert, M., Dementieva, D., Fischer, F., Gasser, U., Groh, G., Günnemann, S., Hüllermeier, E., Krusche, S., Kutyniok, G., Michaeli, T., Nerdel, C., Pfeffer, J., Poquet, O., Sailer, M., Schmidt, A. … Kuhn, J. (2023). ChatGPT for good? On opportunities and challenges of large language models for education. Learning and Individual Differences, 103, 102274. https://doi.org/10.1016/j.lindif.2023.102274
  • Khalil, M., & Er, E. (2023). Will ChatGPT get you caught? Rethinking of Plagiarism Detection. arXiv, 2302.04335. https://doi.org/10.48550/arXiv.2302.04335
  • Leidig, P., & Salmela, H. (2020). IS2020: A competency model for undergraduate programs in information systems. The Joint Report by ACM/AIS IS 2020 Task Force.
  • Nah, F.-H. F., Zheng, R., Cai, J., Siau, K., & Chen, L. (2023). Generative AI and ChatGPT: Applications, challenges, and AI-human collaboration. Journal of Information Technology Case & Application Research, 25(3), 277–304. https://doi.org/10.1080/15228053.2023.2233814
  • Stahl, B. C., & Eke, D. (2024). The ethics of ChatGPT – exploring the ethical issues of an emerging technology. International Journal of Information Management, 74, 102700. https://doi.org/10.1016/j.ijinfomgt.2023.102700
  • Stallman, H. M. (2009). Embedding resilience within the tertiary curriculum: A feasibility study. Higher Education Research & Development, 30(2), 121–133. https://doi.org/10.1080/07294360.2010.509763
  • Thorp, H. H. (2023). ChatGPT is fun, but not an author. Science, 379(6630), 313. https://doi.org/10.1126/science.adg7879
  • Topi, H., Valacich, J. S., Wright, R. T., Kaiser, K. M., Nunamaker, J. F., Sipior, J. C., & De Vreede, G. J. (2010). IS 2010: Curriculum guidelines for undergraduate degree programs in information systems. The Joint Report by ACM/AIS IS 2010 Task Force.
