
AR Computer-Assisted Learning for Children with ASD based on Hand Gesture and Voice Interaction

Kahina Amara, Chahrazed Boudjemila, Nadia Zenati, Oualid Djekoune, Drifa Aklil & Mouna Kenoui

Abstract

For children with autism spectrum disorder (ASD), innovative computer-assisted solutions can provide useful teaching, learning, and support tools. Speech, video, narrative, and a mix of 2D/3D images of virtual objects shown on screens and mobile phones can help improve the learning experience. Augmented Reality (AR) is a valuable technology for achieving certain educational objectives. This study presents a gesture- and voice-based learning framework for children with ASD that offers interactivity, engagement, and visual support during school/cognitive therapy sessions. The proposed prototype was designed to increase children's engagement and focus during the session and to achieve a high level of social interaction and inclusion. It was implemented in accordance with the requirements of specialists from regional rehabilitation centres for children with ASD, and structured activities and AR-based games were defined accordingly. A multi-baseline feasibility pilot study was carried out with 18 children, aged 2–12 years, diagnosed with ASD and with learning difficulties (problems with concentration, language development, social interaction, and inclusion), and 3 therapists/parents. Preliminary qualitative and quantitative results show significantly higher engagement and concentration time using the proposed AR-computer system compared to a non-computer condition. In addition to improving the children's receptive vocabulary and social interaction, the prototype reduces teacher workload while keeping the children engaged.

DISCLOSURE STATEMENT

No potential conflict of interest was reported by the author(s).

Additional information

Notes on contributors

Kahina Amara

Kahina Amara obtained her PhD degree at USTHB, Algiers, Algeria, in 2018 and her master's in control and robotics at USTHB in 2011. She works as a permanent researcher at CDTA, Algeria, with the IRVA team. Her main research interests include augmented and virtual reality, 3D interaction, affective computing, computer vision, emotion recognition, and healthcare. Corresponding author. Email: [email protected]

Chahrazed Boudjemila

Chahrazed Boudjemila obtained her master's degree in 2017 at USTHB and works on human–computer interaction. Email: [email protected]

Nadia Zenati

Nadia Zenati received her PhD degree in process control and robotics from the University of Franche-Comté, France, in 2008. She now works as a researcher at CDTA, Algeria. Email: [email protected]

Oualid Djekoune

Oualid Djekoune received his PhD degree in process control and robotics from the Electronics Institute of USTHB, Algiers, in 2010. He now works as a researcher at CDTA, Algeria. Email: [email protected]

Drifa Aklil

Drifa Aklil obtained her master's degree in 2017 at USTHB and works on human–computer interaction. Email: [email protected]

Mouna Kenoui

Mouna Kenoui works as a research engineer at CDTA, Algeria. Her main research interests include augmented and virtual reality, interaction, and cloud computing. Email: [email protected]
