Editorial

Artificial intelligence (AI), conversational agents, and generative AI: implications for adult education practice and research

The AI era is upon us, and it is still within our power to ensure it brings prosperity for all.

(Kristalina Georgieva, International Monetary Fund)

Coined by McCarthy et al. (Citation1955), the term ‘artificial intelligence’ (AI) refers to the ability of a digital machine to emulate human cognition and decision-making. Ever since, AI has transformed production systems as well as many job- and non-job-related activities. Meanwhile, a new research area has emerged: AI in Education (AIED), which studies how teaching and learning practices and program development may ‘benefit’ from applying AI technologies like intelligent tutoring systems, chatbots, and automated assessment. AIED is increasingly dealing with the ethical dimensions of education, yet largely independently of philosophical, psychological, or moral concerns (Mouta et al., Citation2023). Today, AI permeates many professional and non-professional areas, impacting how people interact, learn, work, and live. Adults are thus expected to ‘collaborate and cooperate’ with AI across professional and non-professional contexts (Laupichler et al., Citation2022, p. 2).

A previous editorial (Holford et al., Citation2019) questioned AI’s use and potential benefits to policymaking in adult education at a time when the 2019 Council Recommendation on Artificial Intelligence of the Organisation for Economic Co-operation and Development promoted a set of principles among its member countries for a human-centric and trustworthy AI (https://legalinstruments.oecd.org/). Five years later, significant advancements in AI technologies and natural language processing (NLP) have transformed people’s daily lives, including the lives of those involved in teaching and learning within adult education. The public release of ChatGPT (Generative Pre-trained Transformer) has boosted the use of NLP in several professions and in teaching and learning processes, especially in secondary school and post-compulsory education (Gimpel et al., Citation2023; Laupichler et al., Citation2022). The novelty of ChatGPT is its use of NLP to mimic human conversation and create seemingly new texts in various genres – e.g. journalistic, academic, poetic. Several countries – e.g. the USA, China, Germany – have set up national strategies to integrate AI into education and to support the training of young people and adults alike in AI and its use, not least by sponsoring initiatives by private companies and other stakeholders, like the free course Elements of AI (https://www.elementsofai.com) developed by the University of Helsinki and MinnaLearn, an online learning company. Philanthropic funders, like the Jacobs Foundation, have channelled large amounts of funding to research centres devoted to helping people learn, live, and work in the age of AI, including sponsoring the Center for Learning and Living with AI (CELLA), in cooperation with the University of Oulu, Finland, and Radboud University, the Netherlands. Across countries, several public and private companies are increasingly using AI technologies to bolster and support internal non-formal learning and employee training. Yet relatively little research attention has been given to AI use beyond K-12 and university education (see, among others: Rawas, Citation2023; Sanusi et al., Citation2023). In this editorial, we dig into some of the AIED literature and beyond to stimulate reflection on a few questions central to adult educators and researchers: What are the potential and limits of using AI technologies to support teaching and learning processes? What knowledge should adults acquire to avoid being left behind in their studies, profession, and other daily life areas? Can adult education researchers also benefit from the use of AI technologies? What ethical concerns should adult educators and researchers have in using AI technologies for their profession? To do so, we first clarify what we mean when we speak of AI technologies and what is known about the use of conversational agents in education, before centring attention on the relationships between AI technologies and adults’ lives, education, and learning. Finally, we briefly consider using AI technologies for research.

AI technologies and conversational agents

Following a nested approach to conceptualising AI technologies, Gimpel et al. (Citation2023) position ‘AI’ at the highest level of abstraction, encompassing all technologies that make digital devices capable of acting in an environment. Machine learning (ML), one level down, allows digital devices to learn and improve on a specific task by identifying patterns and making predictions based on algorithms applied to input data. Generative AI, which typically uses ML, can generate new data or outputs such as text, images, or music, whereas a Large Language Model (LLM) is a specific ML model capable of processing and generating natural language text.
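
To make these distinctions concrete, the short sketch below illustrates machine learning in the sense just described: a model that improves at a narrow task by identifying patterns in labelled input data rather than by following hand-written rules. The sketch is our own illustration, not drawn from Gimpel et al. (Citation2023); the library (scikit-learn) and the toy data are assumptions made purely for demonstration.

    # Minimal machine-learning sketch (illustrative): the model learns to flag
    # spam-like messages from a handful of labelled examples.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    messages = ["win a free prize now", "meeting moved to 3pm",
                "claim your free reward", "minutes from today's meeting"]
    labels = [1, 0, 1, 0]  # 1 = spam-like, 0 = ordinary

    vectorizer = CountVectorizer()
    X = vectorizer.fit_transform(messages)       # turn text into word-count features
    model = LogisticRegression().fit(X, labels)  # identify a pattern in the labelled data

    print(model.predict(vectorizer.transform(["free prize waiting"])))  # likely [1]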

Generative AI technologies have introduced far-reaching changes in how humans interact with digital machines. For instance, chatbots are software programs capable of mimicking human conversation through text or voice interaction; they may or may not be powered by an ML component. While their origins date back decades, their evolution and sophistication, thanks to advances in ML and NLP, have increased extensively in recent years, exemplified by the release of ChatGPT (November 2022) developed by OpenAI, and Google Bard (May 2023) developed by Google AI. Both chatbots represent generative AI in that they can enter natural language conversations with people and rely on LLMs to generate text in a human-like language and style, applying algorithms that learn patterns from large amounts of input data. The difference is that ChatGPT is trained on a dataset that includes books and articles, increasing its potential for accuracy, while Google Bard is trained on a dataset that includes texts from the internet, which increases its potential to be up to date (Labadze et al., Citation2023).

Today, conversational agents are being applied across sectors, including education. Most chatbots are web-based platforms that are easily available and permanently accessible from multiple locations. Most adults interact with chatbots in their everyday lives, for instance when asking for assistance from a mobile or internet provider. However, using more sophisticated conversational agents like ChatGPT or Bard is usually a conscious and deliberate choice.

Conversational agents have attracted attention from educationalists and researchers across disciplines for the possibilities and risks they present to teachers, learners, and professionals in various fields, and an ample body of literature now exists on their role, opportunities, and challenges in certain sectors (Labadze et al., Citation2023; Tlili et al., Citation2023).

Against this backdrop, the sophistication of AI technologies, specifically conversational agents, is expected to increase with time. Some people fear they will replace humans, while others insist they display limitations because they are ‘trained’ with finite, albeit large, data sets. Either way, these technologies have limits and present risks that people should be aware of when conversing with them. For instance, everything these technologies produce is either pre-set or based on a limited concatenation of data and information that humans have previously said and written, and non-expert users – i.e. people without a background or specialised profession in computer science, AI engineering, or the like – are typically unaware of the methodology these technologies use to perform the concatenation behind the generation of their outputs in the form of natural language text. Generative AI technologies also lack the circumspection we assume in human actors, and their outputs may contain false as much as meaningful information, including ‘hallucinations’ – e.g. when the AI technology has learned an incorrect pattern based on incomplete or biased training data (Gimpel et al., Citation2023).

Conversational agents’ use in education

Views on using conversational agents in education are polarised. On the one hand, they are considered of great help, especially for people with language difficulties such as dyslexia. On the other hand, many believe they should be used with caution and parsimony. For instance, in the public discourse on the use of AI in education as depicted in Twitter (now X) exchanges, Tlili et al. (Citation2023) point to both positive and negative sentiments; even if positive sentiments outweigh negative ones, the negative sentiments are usually deeper, grounded in critical thinking, and address reasons to approach conversational agents like ChatGPT with caution.

Research with teachers and students using chatbots and more sophisticated conversational agents brings to light, on the positive side, that students appreciate their use as study assistants (Labadze et al., Citation2023), as they provide basic knowledge on various topics, make complex topics easy to understand with a good degree of accuracy in the information they provide, and are easy and fun to interact with (Tlili et al., Citation2023). Moreover, they can support learning and skills development by providing ‘personalised’ feedback to enhance writing skills, offering syntactic and grammatical corrections, and facilitating debate – e.g. when they suggest discussion structures (Labadze et al., Citation2023). Teachers appreciate chatbots as time-saving assistants for routine tasks like scheduling or grading and as aids to improve their pedagogy – for example, by helping provide personalised support to students or tailoring content to different students’ needs (Labadze et al., Citation2023). On the negative side, teachers and students also recognise that critical evaluation of what AI technologies produce is important. For instance, information from ChatGPT is limited (timewise) and occasionally fallacious; ‘conversations’ with it lack recognition of emotions and other communicative cues typical of human-to-human interactions; and it at times provides contradictory answers to queries on the same topic (Tlili et al., Citation2023). Moreover, using chatbots and more sophisticated conversational agents may prevent engaging deeply with a topic and limit rather than enhance critical thinking and problem-solving (Labadze et al., Citation2023). In addition, interviews with teachers and students using ChatGPT (Tlili et al., Citation2023) brought to light several ethical concerns, including plagiarism and cheating on the part of students, and the worry that such tools reduce critical thinking, making students more prone to rely on biased or fake information and on opinions rather than trustworthy references. Researchers also stress conversational agents’ lack of reliability and accuracy, which might mislead students or hinder their learning process, and raise ethical concerns about data privacy, security, and the responsible use of AI technologies (Labadze et al., Citation2023). These concerns are echoed by non-expert users worried about exposing their private and demographic data (Tlili et al., Citation2023).

Recently, Zirar (Citation2023) and Chiu et al. (Citation2023) reviewed the (English-language) literature on the pros and cons of using AI in education, suggesting that NLP may alter learning and assessment experiences for good and for bad in different educational domains, hence supporting both enthusiasts and sceptics. For instance, Zirar (Citation2023) suggests that NLP may provide information that sparks problem-solving or critical thinking among learners and can help learners and teachers produce study syntheses or teaching material. At the same time, ‘reliance on them without critical evaluation adversely impacts student learning’; thus, NLP should be used parsimoniously to ‘play a specific and defined role’ (Zirar, Citation2023, p. 1) in teaching and learning transactions. Chiu and colleagues (Citation2023) identify four roles for AI in supporting student learning, together with their respective challenges: 1) AI can help with ‘assigning tasks based on individual competence’ but is challenged by the lack of supportive learning resources to match tasks to individual competence; 2) AI can help in ‘providing human-machine conversations’, but such conversations are not free from challenges, and knowledge of when and how they may improve the learning experience remains uncertain; 3) AI can help in ‘analysing student work for feedback’, but when such feedback is prepared in advance there are doubts about whether it really meets the individual needs of learners; and finally, 4) AI can help in ‘increasing adaptability and interactivity in digital environments’, but capturing student learning data, for instance, is mostly done with the aim of developing AI-supported digital environments rather than of studying their effects on students’ learning and experience. Chiu and colleagues (Citation2023) also found three roles assigned to AI in teaching, with related challenges: 1) AI can help in ‘providing adaptive teaching strategies’ – e.g. recommending appropriate teaching content and tasks – but still lacks practical testing and criteria to judge the effectiveness of the intelligent systems employed; 2) AI can help in ‘enhancing teachers’ ability to teach’ with regard to effective classroom management – e.g. AI technologies helped upload, assign, and distribute learning materials and assignments – yet most teachers do not understand how these technologies work; and finally, 3) AI can support ‘teacher professional development’. For instance, AI technologies have provided teachers with suggestions and feedback on their teaching based on automated data analysis; however, tips and feedback were pre-set and did not always match the teachers’ needs, especially those of more experienced teachers. Interestingly, however, of the 92 journal articles reviewed by Chiu et al. (Citation2023), only one deals with adult education – experts and managers in an edutech company – whereas the striking majority (60) concern higher education.

In higher education, Gimpel et al. (Citation2023) suggest several areas where generative AI technologies could benefit university students. These areas are equally relevant for adult learners outside higher education. First, conversational agents like ChatGPT necessitate adequate prompts to generate valuable results. Producing such prompts, as much as evaluating the quality of the results, requires adults to organise and categorise information logically and coherently. Hence, these agents can help structure human learners’ thinking. Second, multiple interactions on the same topic with conversational agents can help refine the process of novel text generation and allow conversational agents to be used more instrumentally – e.g. to summarise rather than produce novel text. In short, following these authors, adults may think of conversational agents as ‘partners’ in the creation of text. This, however, implies that adults become aware that conversational agents cannot be held responsible for the results they produce, and that such results may not be up to date, trustworthy, or accurate (Atlas, Citation2023). In fact, as mentioned, one of the limitations of conversational agents is the loss of connection between the information they provide and the original sources. Therefore, adults need to verify the correctness of the results of any prompt and to look for valid sources of the information they contain. Accordingly, the authors also suggest ways university staff may use generative AI technologies to support their teaching, which is helpful for adult educators, too. First, to develop critical thinking, teachers and educators may use the limitations of generative AI to encourage learners to reflect on the output generated by AI technologies. They can also invite learners to engage in iterative communications with generative AI technologies to foster reflection on the relationship between the prompts offered by the learners and the output provided by these technologies. Second, the authors argue that teachers may use AI technologies to ‘develop lecture ideas, draft plans and module descriptions, and craft announcement texts’ (ibid., p. 29). In addition, Mollick and Mollick (Citation2022) suggest using ChatGPT to support learners with knowledge transfer by applying acquired knowledge to different situations, raising awareness of the limitations of their knowledge, and encouraging critical thinking about the information it provides.
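
To make the iterative, instrumental use described above more tangible, the short sketch below first asks a conversational agent to summarise a text and then refines the request within the same conversation. It is our own illustration rather than an example from the cited authors; it assumes the OpenAI Python client and an illustrative model name, other providers expose broadly similar interfaces, and the output of either turn still requires the human verification discussed above.

    # Iterative prompting sketch (illustrative): summarise first, then refine.
    from openai import OpenAI

    client = OpenAI()  # reads the API key from the OPENAI_API_KEY environment variable
    history = [{"role": "user",
                "content": "Summarise the following paragraph in two sentences: ..."}]
    first = client.chat.completions.create(model="gpt-4o-mini", messages=history)
    history.append({"role": "assistant", "content": first.choices[0].message.content})

    # Second turn: refine rather than regenerate, keeping the conversation history.
    history.append({"role": "user",
                    "content": "Rewrite the summary for an adult learner new to the topic."})
    second = client.chat.completions.create(model="gpt-4o-mini", messages=history)
    print(second.choices[0].message.content)  # still needs checking against the original sources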

AI technologies and adults’ lives, education, and learning

An increasing number of adults (including adult teachers and educators) are making use of generative AI technologies in their study and work today, but they are often unaware of the underlying LLMs and the limits of generative AI technologies, or do not adequately consider the ethical risks such technologies pose. At the same time, a large share of the adult population ignores or fears generative AI technologies and, therefore, does not benefit from the potential advantages these could bring to improve, for instance, their study or working performance. Accordingly, it is important for both adult educators and learners to acquire competencies in the field of AI, which they can use for teaching and learning as much as for other professional and personal reasons. In this respect, several researchers concur that adults need training in AI literacy – ‘a set of competencies that enables individuals to critically evaluate AI technologies, communicate and collaborate effectively with AI, and use AI as a tool online, at home, and in the workplace’ (Long & Magerko, Citation2020, p. 2). AI literacy is addressed to non-experts who interact with AI technologies daily for different reasons (Laupichler et al., Citation2022).

For some, AI literacy encompasses four components – know and understand, use and apply, evaluate and create, and ethics – that allow people to move from simply acquiring knowledge to creating knowledge through AI (Ng et al., Citation2021). Others stress that cooperation and creativity are necessary prerequisites to this end; hence, they critique AI literacy for focusing only on knowledge or attitudes directly related to AI and speak instead of the need to develop AI capabilities (Markauskaite et al., Citation2022). Along this line, four sets of AI capabilities – technology-related, work-related, human-machine-related, and learning-related – have been identified in the literature concerned, for instance, with the AI capabilities of employees in digital workplaces (Cetindamar et al., Citation2024). Relatedly, some argue for the need to reframe lifelong learning through the capability approach to better integrate technology within lifelong learning (Poquet & De Laat, Citation2021). Meanwhile, ethical dilemmas remain on whether to support or restrict AI and other data-oriented technologies in education and on ‘balancing human-provided learning and machine-assisted learning’ (Luan et al., Citation2020, ‘The position formulation’ section).

AI technologies and research

Researchers are also using AI technologies, for instance, to summarise research findings, transcribe interview recordings, write papers, and so on. Accordingly, it is also important to consider the potential and limitations of AI technologies for adult education research. Although the literature is still sparse on these matters, Rice et al. (Citation2024) have recently suggested how ChatGPT could help with literature reviews, research methods and designs, and collaboration and communication, among other aspects of the research process. On the positive side, the authors claim that ChatGPT can help identify relevant literature, assist researchers in evaluating such sources, extract, synthesise, and summarise information from those sources, and thus help identify research gaps, generate hypotheses, and aid researchers in developing well-defined questions to guide further research. Moreover, ChatGPT can suggest data collection techniques and sampling strategies by considering research objectives, constraints, and ethical considerations. On the negative side, however, stands not only the limited database that ChatGPT relies upon but also the need for researchers to check and complement such aid by cross-referencing multiple sources, seeking (human) expert opinions to validate the information, fact-checking, and remaining engaged with scholarly communities in the research area of interest. Others have also considered ways ChatGPT can aid in crafting different parts of a research grant – e.g. aims, hypotheses, significance of the proposed project (Najafali et al., Citation2023) – or in producing an abstract (Babl & Babl, Citation2023; Leong, Citation2023). While some scholars suggest declaring which AI technologies have been used in the writing of a piece (see, for instance, Gimpel et al., Citation2023), there are concerns that AI technologies might compromise academic integrity, originality, and validity (see, for instance, Rahimi & Talebi Bezmin Abadi, Citation2023) – concerns that we share, for instance, when publishers such as IGI Global also promote using AI instead of experts to peer-review scholarly work!

Summing up, the literature suggests that there are areas in which chatbots and generative AI technologies like ChatGPT may be helpful tools to support teaching and learning in adult education and adult education research. However, the literature also points to the need to carefully consider the potential and limitations of these tools to avoid limiting or distorting the learning process or jeopardising academic integrity. At the same time, the literature raises ethical concerns regarding data privacy, security, and cheating by learners or researchers, to which we add our own concerns about intellectual property – e.g. once we input research results or ideas into a generative chatbot, we lose control over their use. Yet most of the research (available in English) that has been reviewed extensively in recent years deals primarily with higher education, to some extent with secondary or even primary education, and at times does not specify the level of education. Accordingly, there is a need for further, dedicated research on the potential and limits of (generative) AI technologies in adult and professional education and in research. There is an equal need for systematic reviews of the specialised literature in languages other than English. This also calls attention to the urgency of further investigating the knowledge and know-how adult educators and researchers need to acquire for the effective and ethical use of AI technologies in their professions, and the corresponding knowledge and know-how educators can help adults acquire in different learning environments and professional contexts, including in training junior and senior researchers. Finally, adult educators also need to engage in research-based curriculum design and development in AI literacy and capabilities. University researchers and established professional organisations representing adult education providers and professionals may support this as a collective endeavour through different forms of cooperation and collaboration at national and international levels. We invite our readership to publish and advance research on the above matters.

***

While writing this editorial, we learned with great sadness and share here with our readership that Professor Srabani Maitra died on Sunday, 10th December 2023. Srabani was a valued colleague and friend to some of the past and current editors. She was a Professor in the Sociology of Adult and Vocational Education at the University of Glasgow. She had served the International Journal of Lifelong Education as a referee on transnational migration, vocational education, and education policy for several years until she enthusiastically joined the Editorial Advisory Board only last September. The editorial team mourns her loss.

Acknowledgments

In writing this piece, we used Grammarly (1.49.2.0) to improve the linguistic presentation of our thoughts, yet we take full responsibility for the content.

Disclosure statement

No potential conflict of interest was reported by the author(s).

References

  • Atlas, S. (2023). ChatGPT for higher education and professional development: A guide to conversational AI. Retrieved from https://digitalcommons.uri.edu/cba_facpubs/548/. Last accessed 18-01-2024.
  • Babl, F. E., & Babl, M. P. (2023). Generative artificial intelligence: Can ChatGPT write a quality abstract? Emergency Medicine Australasia, 35(5), 809–811. https://doi.org/10.1111/1742-6723.14233
  • Cetindamar, D., Kitto, K., Wu, M., Zhang, Y., Abedin, B., & Knight, S. (2024). Explicating AI literacy of employees at digital workplaces. IEEE Transactions on Engineering Management, 71, 810–823. https://doi.org/10.1109/TEM.2021.3138503
  • Chiu, T., Xia, Q., Zhou, X., Chai, C. S., & Cheng, M. (2023). Systematic literature review on opportunities, challenges, and future research recommendations of artificial intelligence in education. Computers and Education: Artificial Intelligence, 4, 100118. https://doi.org/10.1016/j.caeai.2022.100118
  • Gimpel, H., Hall, K., Decker, S., Eymann, T., Lämmermann, L., Mädche, A., Röglinger, R., Ruiner, C., Schoch, M., Schoop, M., Urbach, N., & Vandirk, S. (2023, March 20). Unlocking the power of generative AI models and systems such as GPT-4 and ChatGPT for higher education: A guide for students and lecturers. University of Hohenheim.
  • Holford, J., Milana, M., Waller, R. W. S., & Hodge, S. (2019). Data, artificial intelligence and policy-making: Hubris, hype and hope. International Journal of Lifelong Education, 38(6), iii–vii. https://doi.org/10.1080/02601370.2020.1715685
  • Labadze, L., Grigolia, M., & Machaidze, L. (2023). Role of AI chatbots in education: Systematic literature review. International Journal of Educational Technology in Higher Education, 20, 56. https://doi.org/10.1186/s41239-023-00426-1
  • Laupichler, M. C., Aster, A., Schirch, J., & Raupach, T. (2022). Artificial intelligence literacy in higher and adult education: A scoping literature review. Computers and Education: Artificial Intelligence, 3, 100101. https://doi.org/10.1016/j.caeai.2022.100101
  • Leong, A. P. (2023). Clause complexing in research-article abstracts: Comparing human- and AI-generated texts. ExELL (Explorations in English Language and Linguistics), 11(2), 99–132. https://doi.org/10.2478/exell-2023-0008
  • Long, D., & Magerko, B. (2020). What is AI literacy? Competencies and design considerations. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI ‘20), New York, NY, USA (pp. 1–16). Association for Computing Machinery. https://doi.org/10.1145/3313831.3376727
  • Luan, H., Geczy, P., Lai, H., Gobert, J., Yang, S. J., Ogata, H., Baltes, J., Guerra, R., Li, P., & Tsai, C. C. (2020). Challenges and future directions of big data and artificial intelligence in education. Frontiers in Psychology, 11, 580820. https://doi.org/10.3389/fpsyg.2020.580820
  • Markauskaite, L., Marrone, R., Poquet, O., Knight, S., Martinez-Maldonado, R., Howard, S., Tondeur, J., de Laat, M., Buckingham Shum, S., Gašević, D., & Siemens, G. (2022). Rethinking the entwinement between artificial intelligence and human learning: What capabilities do learners need for a world with AI? Computers and Education: Artificial Intelligence, 3, 100056. https://doi.org/10.1016/j.caeai.2022.100056
  • McCarthy, J., Minsky, M., Rochester, N., & Shannon, C. (1955). A proposal for the Dartmouth Summer Research Project on Artificial Intelligence. AI Magazine, 27(4), 12–14.
  • Mollick, E. R., & Mollick, L. (2022, December 13). New modes of learning enabled by AI chatbots: Three methods and assignments. SSRN Electronic Journal. Last accessed 12-02-2024. https://doi.org/10.2139/ssrn.4300783
  • Mouta, A., Pinto-Llorente, A. M., & Torrecilla-Sánchez, E. M. (2023). Uncovering blind spots in education ethics: Insights from a systematic literature review on artificial intelligence in education. International Journal of Artificial Intelligence in Education, 1–40. https://doi.org/10.1007/s40593-023-00384-9
  • Najafali, D., Hinson, C., Camacho, J. M., Logan, G. G., Gupta, R., & Reid, C. M. (2023). Can chatbots assist with grant writing in plastic surgery? Utilizing ChatGPT to start an R01 grant. Aesthetic Surgery Journal, 43(8), NP663–NP665. https://doi.org/10.1093/asj/sjad116
  • Ng, D. T. K., Leung, J. K. L., Chu, S. K. W., & Qiao, M. S. (2021). Conceptualising AI literacy: An exploratory review. Computers and Education: Artificial Intelligence, 2, 100041. https://doi.org/10.1016/j.caeai.2021.100041
  • Poquet, O., & De Laat, M. (2021). Developing capabilities: Lifelong learning in the age of AI. British Journal of Educational Technology, 52(4), 1695–1708. https://doi.org/10.1111/bjet.13123
  • Rahimi, F., & Talebi Bezmin Abadi, A. (2023). Passive contribution of ChatGPT to scientific papers. Annals of Biomedical Engineering, 51(11), 2340–2350. https://doi.org/10.1007/s10439-023-03260-8
  • Rawas, S. (2023). ChatGPT: Empowering lifelong learning in the digital age of higher education. Education and Information Technologies, 1–14. https://doi.org/10.1007/s10639-023-12114-8
  • Rice, S., Crouse, S. R., Winter, S. R., & Rice, C. (2024). The advantages and limitations of using ChatGPT to enhance technological research. Technology in Society, 76, 102426. https://doi.org/10.1016/j.techsoc.2023.102426
  • Sanusi, I. T., Oyelere, S. S., Vartiainen, H., Suhonen, J., & Tukiainen, M. (2023). A systematic review of teaching and learning machine learning in K-12 education. Education and Information Technologies, 28(5), 5967–5997. https://doi.org/10.1007/s10639-022-11416-7
  • Tlili, A., Shehata, B., Adarkwah, M. A., Bozkurt, A., Hickey, D. T., Huang, R., & Agyemang, B. (2023). What if the devil is my guardian angel: ChatGPT as a case study of using chatbots in education. Smart Learning Environments, 15(23), 1–24. https://doi.org/10.1186/s40561-023-00237-x
  • Zirar, A. (2023). Exploring the impact of language models, such as ChatGPT, on student learning and assessment. Review of Education, 11(3), e3433. https://doi.org/10.1002/rev3.3433
