Quantifying patient experiences with therapeutic neurorehabilitation technologies: a scoping review

Pages 1662-1672 | Received 02 Dec 2022, Accepted 06 Apr 2023, Published online: 03 May 2023

Abstract

Purpose

Neurorehabilitation technologies are a novel approach to providing rehabilitation for patients with neurological conditions, and there is a need to explore patient experiences with them. This study aimed: 1) to identify available questionnaires that assess patients’ experiences with neurorehabilitation technologies, and 2) where reported, to document the psychometric properties of the identified questionnaires.

Materials and Methods

Four databases were searched (Medline, Embase, Emcare and PsycInfo). The inclusion criteria were all types of primary data collection that included neurological patients of all ages who had experienced therapy with neurorehabilitation technologies and completed questionnaires to assess these experiences.

Results

Eighty-eight publications were included. Fifteen different questionnaires, along with many self-developed scales, were identified. These were categorised as: 1) self-developed tools, 2) specific questionnaires for a particular technology, and 3) generic questionnaires originally developed for a different purpose. The questionnaires were used to assess various technologies, including virtual reality, robotics, and gaming systems. Most studies did not report any psychometric properties.

Conclusion

Many tools have been used to evaluate patient experiences, but few were specifically developed for neurorehabilitation technologies and psychometric data were limited. A preliminary recommendation would be to use the User Satisfaction Evaluation Questionnaire to evaluate patient experience with virtual reality systems.

IMPLICATIONS FOR REHABILITATION

  • Fifteen unique tools evaluating patient experiences with neurorehabilitation technology were identified

  • The User Satisfaction Evaluation Questionnaire and the ArmAssist Usability Assessment Questionnaire were designed specifically for therapeutic neurorehabilitation technology

  • For all identified tools, psychometric data were poorly reported or not available

  • A preliminary recommendation is to use the User Satisfaction Evaluation Questionnaire for evaluating virtual reality systems

Introduction

Neurological conditions are the leading cause of disability and the second leading cause of death globally [Citation1]. Fortunately, survival prospects for patients with neurological disorders have improved due to advances in treatment provision and acute therapies [Citation2]. Despite improvements in medical management, patients often have persisting impairments, resulting in limitations in many occupations, including activities of daily living, leisure activities, work, and recreation [Citation2]. As populations steadily grow, health service providers are likely to face increasing demand for rehabilitation services [Citation3].

Neurorehabilitation technologies, defined as therapeutic technology to deliver or enhance rehabilitation, are emerging as a novel, engaging and potentially beneficial approach to providing rehabilitation [Citation4]. The specific therapeutic technologies continue to evolve, but can include robotics, non-invasive brain stimulation, virtual reality, augmented reality, gaming devices, and bodyweight support treadmill training [Citation5–8]. The therapeutic benefits of neurorehabilitation technologies appear to be underpinned by the ability to deliver increased repetitions, duration and intensity of training, reduced physical demands on therapists, improved patient safety, and better cost efficiency [Citation9]. Evidence indicates that neurorehabilitation technologies are likely to be as effective as, or superior to, traditional forms of therapy in supporting recovery [Citation5]. As such, technology is likely to play an increasing role in neurorehabilitation.

Despite these potential therapeutic benefits, it remains important to consider the patient’s perspective when introducing novel therapies. Client-centred practice is vital to support a therapeutic process that is suited to each patient by being responsive to each individual’s values, needs, and preferences [Citation10]. Evaluating patients’ experiences with neurorehabilitation technologies using questionnaires and surveys is one way to gain insight into the patient experience with these emerging therapies [Citation11]. A previous systematic review provided evidence that several questionnaires were available to assess patient motivation and satisfaction during technology-assisted rehabilitation [Citation12]. Nine different questionnaires were identified; however, the search was limited to robotics, virtual reality, and serious games, and did not include other types of neurorehabilitation technology [Citation12]. Furthermore, it was not reported whether the identified tools had established psychometric properties, which could prove important for selecting appropriate measures or, alternatively, may guide future work to evaluate the reliability and validity of the identified questionnaires. Further work is therefore required to collate all available measures for assessing patients’ experiences across a wide range of neurorehabilitation technologies, together with their psychometric properties.

The primary aim of this study was to identify all available questionnaires that assess patients’ experiences with neurorehabilitation technologies. The secondary aim was to extract the psychometric data of the questionnaires where reported. With the increasing use of neurorehabilitation technologies and the continued development of new technology-based therapies, there is value in a comprehensive summary of questionnaires, and their psychometric data, to understand patient experiences and perspectives.

Methods

Design

A scoping review was selected to address the aims of this study. By definition, a scoping review maps the available body of literature on a topic area [Citation13]. It assesses the scope and size of the available literature to identify the extent, quantity and type of research on a given topic [Citation13]. This scoping review was conducted and reported according to the Preferred Reporting Items for Systematic Reviews and Meta-analyses Extension for Scoping Reviews (PRISMA-ScR) [Citation14]. This approach included: 1) defining the research objective; 2) developing a protocol; 3) searching for relevant studies; 4) screening relevant studies; 5) extracting data; 6) summarising and reporting the results [Citation14]. The final protocol was registered with Open Science Framework on 28 January 2022 (https://osf.io/8ay5j/).

Inclusion & exclusion criteria

The review was guided by the population, concept, context framework. The population for this review included people with a neurological condition who have used therapeutic neurorehabilitation technologies. Neurological pathologies included, but were not limited to, stroke, traumatic brain injury, spinal cord injury, Parkinson’s Disease, or multiple sclerosis. There were no other limits on the population regarding age, sex, impairment, or chronicity, as the aim was to seek as much information as possible. The overarching concept was patients’ experiences with neurorehabilitation technologies. Neurorehabilitation technology refers to therapeutic-based technology used to help patients recover. This equipment included upper and lower limb robotics (including resistance-based, accelerometer-based or dynamic systems), exoskeletons, non-invasive brain stimulation, virtual reality, augmented reality, gaming devices, or bodyweight support treadmill training. Assistive technologies such as text-to-speech systems or wheelchairs that were compensatory rather than therapeutic were not included. A secondary element of interest included the reported validity and reliability of the questionnaires, where available. The context for this review was specifically using questionnaires or surveys. Qualitative studies that included interviews or focus groups to assess patients’ experiences were excluded.

Search strategy

Identification of studies was achieved by searching the Medline, Embase, Emcare, and PsycInfo databases. The search strategy was developed with the support of an academic librarian and carried out on 28 April 2022. An example search for Medline is provided in Supplementary Appendix 1, and this strategy was adapted for each database. No limits were set on time or language. Search results were exported into EndNote and Covidence, and duplicates were removed in both software packages.

Study selection

The screening process was carried out in Covidence independently by two reviewers (CMN and BH), with discrepancies resolved by discussion or in consultation with a third reviewer (JU). The first step involved eliminating irrelevant articles by examining their titles and abstracts. The second step involved selecting or excluding articles based on full-text examination. Details of the articles included in the review, and the reasons why articles were excluded, were recorded by each reviewer.

Data extraction

Two reviewers (CMN and BH) jointly developed a data extraction table to determine the critical pieces of information to chart from each article. One reviewer extracted the data, with a second reviewer checking and confirming the extractions. Discrepancies were resolved by discussion. Data extracted included the author/s, year of publication, study aim, population, intervention type and details, type of questionnaire used, and psychometric data including reliability and validity. Key themes were summarised in a tabular format to integrate and synthesise the information found, including the number of studies that used a specific type of questionnaire, the neurorehabilitation technology for which different questionnaires were used, the psychometric data available for each questionnaire, and the number of different neurological conditions. A quality appraisal was not conducted, as scoping reviews aim to provide an overview of existing evidence regardless of its quality [Citation14].
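For readers who want to reproduce this kind of tally, the short Python sketch below builds a small, entirely hypothetical charting table with fields like those listed above and counts studies per questionnaire and technology. It is a generic illustration of the synthesis step, not the review’s actual dataset or analysis code.

```python
import pandas as pd

# Hypothetical charting table: one row per included study, mirroring the
# fields extracted in this review (not the authors' actual dataset).
extraction = pd.DataFrame({
    "study": ["Study A", "Study B", "Study C", "Study D"],
    "questionnaire": ["USEQ", "SUS", "Self-developed", "SUS"],
    "technology": ["Virtual reality", "Exoskeleton", "Robotics", "Virtual reality"],
    "condition": ["Stroke", "Spinal cord injury", "Stroke", "Multiple sclerosis"],
    "psychometrics_reported": [True, False, False, False],
})

# Counts of the kind summarised in Tables 1 and 2: studies per questionnaire,
# studies per questionnaire-technology pairing, and studies reporting psychometrics.
print(extraction["questionnaire"].value_counts())
print(extraction.groupby(["questionnaire", "technology"]).size())
print("Studies reporting psychometrics:", int(extraction["psychometrics_reported"].sum()))
```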

Results

Search results

The search identified 2919 articles, which reduced to 2163 once duplicates were removed. After the two screening steps, 88 studies were included in the final review (Figure 1).

Figure 1. PRISMA flow chart of search results and included studies. The search identified 2919 articles and 88 were included in the review.

Description of included studies

The 88 included studies were published between 2003 and 2022. A broad range of neurological conditions was represented, including stroke (n = 50), spinal cord injury (n = 11), multiple sclerosis (n = 11), and cerebral palsy (n = 6). Most studies used an observational design (n = 66), followed by randomised controlled trials (n = 15), case studies (n = 3), non-randomised controlled trials (n = 3), and a pseudo-randomised controlled trial (n = 1). A range of neurorehabilitation technologies was used, including virtual reality (n = 40), robotics (n = 20), exoskeletons (n = 14), video games (n = 11), treadmill training (n = 1), accelerometers and surface electromyography used as a game controller (n = 1), and transcranial direct current stimulation (n = 1). Fifteen different questionnaires were identified, along with many self-developed tools based on Likert scales, visual analogue scales, or numeric rating scales, to assess patient experiences with neurorehabilitation technologies. These questionnaires are presented in three categories: 1) self-developed measures, 2) tools designed specifically for neurorehabilitation technology, and 3) generic assessments that were used to assess patient experiences with neurorehabilitation technology but were developed for a different purpose. A summary of the identified tools, the neurological populations that have used each tool, and the types of neurorehabilitation technology is provided in Table 1. Reported psychometric properties for the identified tools are summarised in Table 2.

Table 1. Summary of questionnaires identified to evaluate patient experiences with neurorehabilitation technologies.

Table 2. Summary of psychometric data for identified tools to evaluate patient experiences with neurorehabilitation technologies.

Self-developed questionnaires

Fifty studies reported use of self-developed assessments to quantify patient experiences. Often these questionnaires were adapted to suit the technology and population of interest. Twenty studies used Likert scales [Citation15–26,Citation39–42,Citation52–55], nine used visual analogue scales [Citation27–31,Citation46,Citation56,Citation57,Citation62], and two used numeric rating scales to assess patient experiences [Citation47,Citation48]. Nineteen of the 50 studies that used self-developed assessments did not detail the features of these questionnaires or how responses were quantified [Citation32–38,Citation43–45,Citation49–51,Citation58–61,Citation63,Citation64].

Some self-developed tools were well described and clearly reported the questionnaire features. For example, Aguilera-Rubio et al. [Citation15] created a Likert scale questionnaire comprising nine items that assessed experiences of the usefulness of their virtual reality platform, the degree of patients’ motivation, technical problems encountered by patients, pain levels, and the importance of therapists’ support during the intervention. Each item was scored from one to four, giving a maximum score of 36. All questions were directly proportional, meaning the higher the score, the better the patient’s perception of the intervention. An example of a visual analogue scale can be seen in Gagnon et al. [Citation57], where the self-developed questionnaire contained 41 statements pertaining to seven key areas, including overall satisfaction with the robotic exoskeleton training program, satisfaction related to the exoskeleton itself, patients’ perceived learnability, patients’ satisfaction relating to the program, and patients’ perceived health benefits gained from the program. To rate each statement, patients positioned an electronic cursor on a 100 mm visual analogue scale [Citation57]. Many studies produced questionnaires with similar features to Gagnon et al. [Citation57] and Aguilera-Rubio et al. [Citation15], such as using Likert scales to assess satisfaction, motivation, and experience [Citation39,Citation45,Citation52,Citation54–56]. As these questionnaires were self-developed, most have not been tested for reliability and validity. Only Gagnon et al. [Citation57] reported that the content validity of their questionnaire was considered acceptable, since it was developed following a review of the literature, consultations with therapists, and feedback provided by individuals in their target population group.

Specifically developed tools for technology in neurorehabilitation

The User Satisfaction Evaluation Questionnaire (USEQ)

Four studies used the USEQ, a questionnaire specifically designed to evaluate patient satisfaction with virtual reality systems used for rehabilitation [Citation11,Citation65–67]. The original study that created and introduced the USEQ was also identified during the search [Citation17]. That study composed a set of questions evaluating satisfaction, derived from the Suitability Evaluation Questionnaire [Citation17]. The USEQ has six questions rated on a five-point Likert scale, with the total score ranging from six (indicating poor satisfaction) to 30 (indicating excellent satisfaction) [Citation17]. The USEQ has a Cronbach’s alpha of 0.716, indicating adequate internal consistency [Citation17].
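To make the scoring arithmetic and the reported internal-consistency statistic concrete, the minimal Python sketch below sums six hypothetical item ratings (one to five each, as described above) into the 6–30 USEQ total and computes Cronbach’s alpha with the standard formula. The data are invented for illustration only, and the code is not drawn from any of the included studies.

```python
import numpy as np

# Hypothetical responses: rows = patients, columns = the six USEQ items,
# each rated 1-5, so the summed total ranges from 6 to 30.
responses = np.array([
    [5, 4, 4, 5, 3, 4],
    [3, 3, 4, 4, 2, 3],
    [5, 5, 5, 4, 4, 5],
    [4, 4, 3, 5, 3, 4],
    [2, 3, 3, 3, 2, 2],
])

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

total_scores = responses.sum(axis=1)  # per-patient satisfaction totals (6-30)
print("Total scores:", total_scores)
print("Cronbach's alpha:", round(cronbach_alpha(responses), 3))
```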

The ArmAssist Usability Assessment Questionnaire

One study used this questionnaire and reported that it consisted of a 17-item survey designed for the ArmAssist device used in the study’s robotic system [Citation68]. Patients rated the items from ‘one’, meaning ‘strongly agree’, to ‘seven’, meaning ‘strongly disagree’, to evaluate satisfaction with the system [Citation68]. The questionnaire also included three open-ended questions about the patient’s subjective opinion, such as the aspects of the system they most and least liked and any ideas for improving the system [Citation68]. No psychometric properties were reported in this study [Citation68].

Generic assessments

The System Usability Scale (SUS)

The SUS was originally developed to assess the usability of electronic office systems but was used by 14 studies to evaluate patients’ experiences with virtual reality, exoskeletons, robotics, and gaming technologies [Citation68–81]. One study reported that the SUS consists of ten statements, such as “I thought the system was easy to use”, with patients rating their level of agreement from “strongly agree” to “strongly disagree” on a five-point Likert scale [Citation73]. The total score ranges from 0 to 100 points, with higher scores equating to greater usability [Citation73]. No other study provided further information on the SUS, as evaluating patients’ experiences was a minor aim. None of the included studies reported on the reliability and validity of this questionnaire.
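As a concrete illustration of how ten five-point ratings map onto the 0–100 range described above, the sketch below applies the conventional published SUS scoring rule (odd-numbered, positively worded items contribute the rating minus one; even-numbered, negatively worded items contribute five minus the rating; the sum is multiplied by 2.5). The included studies did not describe their scoring in this detail, so this is the standard procedure rather than a method reported by any particular study, and the example ratings are hypothetical.

```python
def sus_score(ratings):
    """Convert ten SUS item ratings (each 1-5) into the 0-100 usability score.

    Conventional SUS scoring: odd-numbered (positively worded) items contribute
    (rating - 1), even-numbered (negatively worded) items contribute (5 - rating);
    the summed contributions are multiplied by 2.5 to give a 0-100 score.
    """
    if len(ratings) != 10 or any(not 1 <= r <= 5 for r in ratings):
        raise ValueError("SUS requires ten ratings, each between 1 and 5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # index 0 corresponds to item 1 (odd-numbered)
        for i, r in enumerate(ratings)
    ]
    return sum(contributions) * 2.5

# Hypothetical ratings for items 1-10 from a single patient.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```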

The Quebec User Evaluation of Satisfaction with Assistive Technology (QUEST)

The QUEST is composed of 27 items that evaluate satisfaction with assistive devices and related services [Citation83]. Six studies used the QUEST to evaluate patient experiences with therapeutic neurorehabilitation technologies, with one study using only the assistive device subscale of the questionnaire [Citation68,Citation80,Citation82–85]. The therapeutic technologies included robotics, exoskeletons, and virtual reality [Citation68,Citation80,Citation82–85]. None of the included studies reported the psychometric properties of the QUEST.

The Psychosocial Impact of Assistive Devices Scale (PIADS)

The PIADS was also originally developed for assistive devices rather than therapeutic rehabilitation technology. It measures user perceptions and psychological factors associated with device use. One study used the PIADS to evaluate patient experiences with a virtual reality system [Citation80]. The authors described the PIADS as a self-reported, 26-item questionnaire that assessed the effects of an assistive device on independence, well-being, and quality of life. The tool is reported to have good internal consistency, test-retest reliability, and construct validity [Citation80].

The Short Feedback Questionnaire (SFQ)

The SFQ was originally developed to evaluate virtual environment tasks by quantifying the sense of presence, perceived difficulty, and discomfort. Although not originally designed for health and rehabilitation systems, four virtual reality studies and one robotic study used the SFQ to evaluate patient experiences with rehabilitation technology [Citation79,Citation86–89]. Demers et al. [Citation86] reported that the SFQ assessed success, control, enjoyment, realism, immersion, and understandability of computer feedback on a five-point Likert scale, with a rating of ‘one’ meaning “not at all” and ‘five’ meaning “to a great extent”, for a total score of 30 points. Two studies adapted the SFQ and prepared their own questionnaire to fit their virtual reality or robotic system [Citation87,Citation89]. None of the studies reported on the psychometric properties of the SFQ.

The Intrinsic Motivation Inventory (IMI)

The IMI was developed to evaluate subjective experiences and motivation for targeted activities. Four studies used the IMI and reported on features of this questionnaire [Citation68,Citation90–92]. The IMI consists of six subscales measuring interest/enjoyment, perceived competence, effort, pressure/tension, value, and relatedness, which are rated on seven-point Likert scales from ‘one’ being “not at all true” to ‘seven’ being “very true”. None of the included studies reported the psychometric properties of this questionnaire.

The Technology Acceptance Model (TAM) tool

The TAM tool is a generic questionnaire that measures user acceptance and usage of technological devices regarding patients’ perceived ease of use, perceived usefulness, and attitude toward using the technology. Four studies used the TAM tool [Citation93–96] and reported that it contained 24 items rated on a seven-point Likert scale, where a score of ‘one’ means “I do not agree at all” and a score of ‘seven’ means “I agree entirely”. No study reported the questionnaire’s psychometric properties.

The Usefulness, Satisfaction, and Ease of Use Questionnaire

The Usefulness, Satisfaction, and Ease of Use Questionnaire measures the subjective usability of a product or service. One study used this questionnaire for a virtual reality system but did not report any details of the questionnaire or its psychometric properties, as evaluating experiences was a minor aim [Citation97].

The Tele-healthcare Satisfaction Questionnaire

The Tele-healthcare Satisfaction Questionnaire measures patient satisfaction with telemedicine but was used by one study to evaluate a gaming system [Citation98]. The authors of that study reported that the questionnaire measured six different areas, including patients’ perceived benefit, usability, self-concept, privacy and loss of control, quality of life, and wearing comfort [Citation98]. The questionnaire included five statements rated by patients on a five-point Likert scale between ‘zero’, meaning “strongly disagree with the statement”, and ‘four’, meaning “strongly agree with the statement” [Citation98]. This study administered only the benefit, usability, and wearing comfort components of the questionnaire and did not report on psychometric properties [Citation98].

The Task-Specific Feedback Questionnaire

One study used this questionnaire for a virtual reality system and reported that it was composed of six questions that patients responded to on a Likert scale from one to five, giving a total score between six and 30 [Citation99]. No information on psychometric properties was reported [Citation99].

NASA Task Load Index

The NASA Task Load Index was originally developed to obtain workload estimates from operators working with various human-machine interfaces. Two studies used this questionnaire and reported that it tested six domains of patients’ self-perception of tool usability [Citation65,Citation78]. Neither study reported on psychometric properties.

The Questionnaire for User Interface Satisfaction

Originally developed to evaluate user acceptance of computer interfaces, this questionnaire was adapted by one study to evaluate a virtual reality system [Citation100]. The study reported that the questionnaire had a Cronbach’s alpha of 0.94, indicating excellent internal consistency [Citation100]. No other information on the features of this questionnaire was reported.

The Game Experience Questionnaire (GEQ)

The GEQ captures the game and playing experience. Two studies used or adapted the GEQ to evaluate patients’ experiences with a gaming system [Citation77,Citation101]. The first questionnaire, referred to as the “In-Game questionnaire”, was filled in by patients during the intervention and was used to investigate patients’ perceived competence, patients’ immersion in the game, the game’s flow, and the negative and positive effects of the game on patients [Citation101]. The second questionnaire, referred to as the “Post-Game questionnaire”, was filled in at the end of the intervention and was composed of four groups of questions aimed at evaluating patients’ “return back to reality”, patients’ energy levels at the end of the game, and negative or positive experiences [Citation101]. Patients were asked to respond to the questions in both questionnaires on a five-point Likert scale. No psychometric properties were reported.

The Client Satisfaction Questionnaire-8 (CSQ-8)

The CSQ-8 was originally developed to quantify satisfaction with the delivery of services. It comprises eight items, each scored on a scale from one to four, with higher scores indicating greater satisfaction. One study used the CSQ-8 to evaluate satisfaction with a virtual reality system [Citation92] but did not provide any psychometric properties of the tool.

Discussion

This scoping review identified 88 studies that used different questionnaires to assess patients’ experiences with therapeutic neurorehabilitation technologies. Patient experiences were evaluated for a wide range of technologies, including virtual reality, robotic systems, and computer-based games. However, few studies provided psychometric properties of the tools, raising questions about the validity and reliability of these measures. Together, this review provides a comprehensive summary of available tools to evaluate patient experiences with neurorehabilitation technologies and identifies the need to conduct further work to evaluate psychometric properties.

This review identified that self-developed questionnaires were most common, as 50 of the 88 included studies developed their own tool for evaluation. Likert scales were frequently used. An advantage of Likert scales is that they are used universally across different fields, making them easy to understand [Citation102]. Furthermore, because they are scales, patients can express their opinions more accurately than when choosing an “either-or” option [Citation102]. However, their disadvantages include low test-retest reliability, statistical limitations, and the time they can take to complete [Citation103]. As Likert scales are easily customisable, researchers can curate questionnaires to evaluate patients’ experiences with their particular technology; however, this makes it difficult to compare questionnaires across different studies. It also means that the questionnaires are often not shown to be reliable and valid and may therefore be biased. The review also did not find any information on whether these questionnaires accurately measure patients’ experiences or whether they are an adequate evaluation method.

Our findings identified that several general questionnaires aimed at evaluating technological systems were used, including the SUS, the TAM tool, and the SFQ [Citation73,Citation86,Citation93], with the SUS being the most commonly used. The SUS also uses a Likert format, consisting of ten statements rated on a five-point scale [Citation104]. Examples of these statements include: “I found the system unnecessarily complex”, “I thought the system was easy to use”, “I felt very confident using the system”, and “I needed to learn a lot of things before I could get going with this system” [Citation104]. These statements clearly apply to the technological aspects of neurorehabilitation technology. However, none of the statements capture patients’ experiences with the therapeutic aspect, such as whether the technology is beneficial for their rehabilitation. This is because the SUS was originally developed for assessing the usability of electronic office systems, not neurorehabilitation technologies [Citation93].

In contrast, our review found two questionnaires specifically designed to evaluate patients’ experiences with therapeutic neurorehabilitation technologies: the USEQ [Citation11], designed to evaluate experience with virtual reality, and the ArmAssist Usability Assessment Questionnaire [Citation68], designed to evaluate a specific robotic system. The USEQ was developed in 2017 and has been used in subsequent work [Citation65,Citation66]. Like the SUS, the USEQ is rated on a five-point scale, but it has six statements [Citation11]. Statements in the USEQ also specifically capture the therapeutic aspect of the technology, such as “do you think that this system will be helpful for your rehabilitation”, “did you feel discomfort during your experience with the system” and “were you able to control the system” [Citation11]. However, it should be acknowledged that responses to these questions may be biased; for example, responses may be overly positive from patients fearful that a negative response might lead to loss of access to the technology being trialled. This could be mitigated by clarifying that responses have no bearing on future access to the technology. Similar to the USEQ, the ArmAssist Usability Assessment Questionnaire is based on Likert scales (a seven-point scale) but consists of 15 items evaluating ease of use, comfort, pain, fatigue, enjoyment, benefits, desire for further use, possible difficulties, and recommendations. However, psychometric properties do not appear to have been reported for this tool. Given that the USEQ was designed specifically for therapeutic virtual reality systems and has adequate internal consistency [Citation11], a preliminary observation is that this tool is appropriate for evaluating patient experiences with therapeutic virtual reality systems, while acknowledging and mitigating the potential bias of overly positive responses. The development of the USEQ demonstrates that questionnaires capturing both the technological and therapeutic aspects of experiences with neurorehabilitation technologies can be developed. Since client-centred practice is important in rehabilitation, it is worth ensuring that questionnaires consider the therapeutic factors of the technology, not just its generic usability. Therefore, this review proposes that there is a need to develop new tools specific to neurorehabilitation technologies.

The vast majority of identified studies did not report on the psychometric properties of the questionnaires; however, this does not mean that psychometric evaluation has not been performed. For example, the SUS is reported to have a Cronbach’s alpha of 0.92 [Citation104], indicating excellent internal consistency. Furthermore, it is reported to have strong face validity [Citation104], content validity, and concurrent validity [Citation105]. It was also found to have fair agreement between raters when evaluating a medical software platform [Citation106]. The QUEST was found to have good internal consistency (Cronbach’s alpha 0.82) and test-retest reliability when evaluated with adults who have multiple sclerosis and used assistive technology [Citation107,Citation108]. The QUEST was also developed with subject matter experts (face validity), with content validity evaluated by 12 experts [Citation108,Citation109]. A systematic review of the psychometric properties of the PIADS identified high content validity, moderate criterion validity, and concurrent validity reported across a range of studies [Citation110]. Further studies also reported excellent internal consistency and face validity for the Usefulness, Satisfaction, and Ease of Use Questionnaire [Citation111], good internal consistency and construct validity for the telehealth satisfaction questionnaire [Citation112], and good internal consistency and test-retest reliability in older adults for the NASA Task Load Index [Citation113,Citation114]. Finally, the CSQ-8 was reported to have excellent internal consistency when tested in a community mental health centre (Cronbach’s alpha 0.92) [Citation115] and good internal consistency when evaluated alongside a measure of treatment perceptions in patients with substance abuse [Citation116]. Although these studies were not identified in our search because they did not include neurorehabilitation technology or were not conducted in a neurological population, there is value in recognising the psychometric evaluations performed on these tools. A key outcome of this review is that a range of measures is available, but few have psychometric properties established for patients with a neurological condition who have used neurorehabilitation technologies.

Future directions

A variety of questionnaires were identified in this review, but only two were explicitly designed for therapeutic neurorehabilitation technology. As such, there is a need to develop new, reliable, and validated questionnaires designed specifically for neurorehabilitation technologies, as generic tools such as the SUS do not cover the therapeutic aspects of the technology. We have also identified a need for further research on the use of generic and self-developed questionnaires for patient evaluation. It is critical that the psychometric properties of these questionnaires are established to provide evidence that they can truly assess patient experiences within the context of using neurorehabilitation technologies. This is important for the assessment of patient outcomes and for planning the patient’s rehabilitation. Finally, in developing and testing future questionnaires designed specifically for neurorehabilitation technologies, there is a need to consider the cognitive capacity of potential participants, ensuring questions are free of technical terminology to enable accurate assessment.

Limitations

Our review was limited to papers published in English and did not include grey literature. Furthermore, the search was completed in Medline, Emcare, Embase, and PsycInfo, so studies indexed only in other databases may not have been found. Moreover, we only extracted psychometric data if they were reported in the included studies. While we are aware of studies that report psychometric data for these measures, those evaluations may have been performed in clinical populations or settings that were not included in this review.

Conclusion

Our review identified a variety of questionnaires used to evaluate patient experiences with neurorehabilitation technologies. These tools included self-developed measures, specific questionnaires designed for therapeutic technologies, and generic questionnaires originally developed for a different purpose. This review provides a useful summary of available measures for neurorehabilitation technologies along with their reported psychometric data. A preliminary recommendation from our review is to consider using the USEQ in studies evaluating virtual reality systems, as it was specifically developed for this therapeutic technology and has adequate internal consistency, although we suggest caution regarding the potential bias toward overly positive responses. This review advocates for more research into producing questionnaires specific to neurorehabilitation technologies, along with further work to evaluate their psychometric properties.

Supplemental material

Supplemental material for this article is available online (MS Word, 18.3 KB).

Disclosure statement

BH holds a paid consultancy role for Recovery VR and has a clinical partnership with Fourier Intelligence. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Additional information

Funding

The author(s) reported there is no funding associated with the work featured in this article.

References

  • Global Burden of Disease Neurology Collaborators. Global, regional, and national burden of neurological disorders, 1990–2016: a systematic analysis for the global burden of disease study 2016. Lancet Neurol. 2019;18:459–480.
  • Playford D, Nair KPS. An introduction to neurological rehabilitation. In: Panicker JN, Nair KPS, González-Fernández M, editors. Neurorehabilitation therapy and therapeutics. Cambridge: Cambridge University Press; 2018. p 1–9.
  • Carroll WM. The global burden of neurological disorders. Lancet Neurol. 2019;18(5):418–419.
  • Sainburg RL, Mutha PK. Movement neuroscience foundations of neurorehabilitation. In: Reinkensmeyer DJ, Dietz V, editors. Neurorehabilitation technology. Cham: Springer International Publishing; 2016. p. 19–38.
  • Molinari M, Esquenazi A, Anastasi AA, et al. Rehabilitation technologies application in stroke and traumatic brain injury patients. In: Pons JL, Raya R, González J, editors. Emerging therapies in neurorehabilitation II. Cham: Springer International Publishing; 2016. p 29–64.
  • Marin-Pardo O, Laine CM, Rennie M, et al. A virtual reality Muscle-Computer interface for neurorehabilitation in chronic stroke: a pilot study. Sensors. 2020;20(13):3754.
  • Cho KH, Kim MK, Lee H-J, et al. Virtual reality training with cognitive load improves walking function in chronic stroke patients. Tohoku J Exp Med. 2015;236(4):273–280.
  • Esquenazi A, Lee S, Packel AT, et al. A randomized comparative study of manually assisted versus robotic-assisted body weight supported treadmill training in persons with a traumatic brain injury. Pm R. 2013;5(4):280–290.
  • Luft A, Bastian AJ, Dietz V. Learning in the damaged brain/spinal cord: neuroplasticity. In: Reinkensmeyer DJ, Dietz V, editors. Neurorehabilitation technology. Cham: Springer International Publishing; 2016. p. 3–17.
  • Enemark Larsen A, Rasmussen B, Christensen JR. Enhancing a Client-Centred practice with the Canadian occupational performance measure. Occup Ther Int. 2018;2018:5956301.
  • Gil-Gomez J-A, Manzano-Hernandez P, Albiol-Perez S, et al. USEQ: a short questionnaire for satisfaction evaluation of virtual rehabilitation systems. Sensors. 2017;17(7):1589.
  • Monardo G, Pavese C, Giorgi I, et al. Evaluation of patient motivation and satisfaction during Technology-Assisted rehabilitation: an experiential review. Games Health J. 2021;10(1):13–27.
  • Peters MDJ, Godfrey CM, Khalil H, et al. Guidance for conducting systematic scoping reviews. Int J Evid Based Healthc. 2015;13(3):141–146.
  • Tricco AC, Lillie E, Zarin W, et al. PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med. 2018;169(7):467–473.
  • Aguilera-Rubio A, Cuesta-Gomez A, Mallo-Lopez A, et al. Feasibility and efficacy of a virtual reality Game-Based upper extremity motor function rehabilitation therapy in patients with chronic stroke: a pilot study. IJERPH. 2022;19(6):3381.
  • Almenara M, Cempini M, Gomez C, et al. Usability test of a hand exoskeleton for activities of daily living: an example of user-centered design. Disabil Rehabil. 2017;12(1):84–96.
  • Choi Y-H, Ku J, Lim H, et al. Mobile game-based virtual reality rehabilitation program for upper limb dysfunction after ischemic stroke. Restor Neurol Neurosci. 2016;34(3):455–463.
  • Jyrakoski T, Merilampi S, Puustinen J, et al. Over-ground robotic lower limb exoskeleton in neurological gait rehabilitation: user experiences and effects on walking ability. TAD. 2021;33(1):53–63.
  • Lee KW, Kim SB, Lee JH, et al. Effect of robot-assisted game training on upper extremity function in stroke patients. Ann Rehabil Med. 2017;41(4):539–546.
  • Lee M, Son J, Kim J, et al. Comparison of individualized virtual reality- and group-based rehabilitation in older adults with chronic stroke in community settings: a pilot randomized controlled trial. Europ J Integr Med. 2016;8(5):738–746.
  • Lee SH, Jung H-Y, Yun SJ, et al. Upper extremity rehabilitation using fully immersive virtual reality games with a head mount display: a feasibility study. Pm R. 2020;12(3):257–262.
  • Piron L, Turolla A, Tonin P, et al. Satisfaction with care in post-stroke patients undergoing a telerehabilitation programme at home. J Telemed Telecare. 2008;14(5):257–260.
  • Seo NJ, Arun Kumar J, Hur P, et al. Usability evaluation of low-cost virtual reality hand and arm rehabilitation games. J Rehabil Res Dev. 2016;53(3):321–334.
  • Swank C, Sikka S, Driver S, et al. Feasibility of integrating robotic exoskeleton gait training in inpatient rehabilitation. Disabil Rehabil. 2020;15:409–417.
  • La Rosa G, Adans-Dester C, Fabara E, et al. A novel End-Effector system to enable Pro-Supination movements during Robot-Assisted Upper-Limb training. Arch Phys Med Rehabil. 2019;100(12):e165.
  • Kikuchi T, Nagata T, Sato C, et al. Sensibility assessment for user interface and training program of an Upper-Limb rehabilitation robot, D-SEMUL. Annu Int Conf IEEE Eng Med Biol Soc. 2018;2018:3028–3031.
  • Bovolenta F, Goldoni M, Clerici P, et al. Robot therapy for functional recovery of the upper limbs: a pilot study on patients after stroke. J Rehabil Med. 2009;41(12):971–975.
  • Bovolenta F, Sale P, Dall’Armi V, et al. Robot-aided therapy for upper limbs in patients with stroke-related lesions. Brief report of a clinical experience. J Neuroeng Rehab. 2011;8:18.
  • Busching I, Sehle A, Sturner J, et al. Using an upper extremity exoskeleton for semi-autonomous exercise during inpatient neurological rehabilitation- a pilot study. J Neuroeng Rehab. 2018;15(1):72.
  • Huber SK, Held JPO, de Bruin ED, et al. Personalized motor-cognitive exergame training in chronic stroke patients-A feasibility study. Front Aging Neurosci. 2021;13:13.
  • Palma E, Rossini L, Del Percio C, et al. Robot-aided therapy for upper limbs in patient with chronic stroke-related lesions. Brief report of a clinical experience. Clin Neurophysiol. 2011;122:S177.
  • Brunner I, Skouen JS, Hofstad H, et al. Virtual reality training for upper extremity in subacute stroke (VIRTUES): study protocol for a randomized controlled multicenter trial. BMC Neurol. 2014;14(1):186.
  • Kang YJ, Ku J, Han K, et al. Development and clinical trial of virtual reality-based cognitive assessment in people with stroke: preliminary study. Cyberpsychol Behav. 2008;11(3):329–339.
  • Kim J, Lee M, Kim Y, et al. Feasibility of an individually tailored virtual reality program for improving upper motor functions and activities of daily living in chronic stroke survivors: a case series. Europ J Integr Med. 2016;8(5):731–737.
  • Parikh A, Legault C, Flavin K, et al. Neofect glove: virtual reality device for home therapy in stroke survivors. Neurology. 2018; 90(15 Supplement):P5.007.
  • Riggs A, Patel V, Paneri B, et al. At-Home transcranial direct current stimulation (tDCS) with telehealth support for symptom control in Chronically-Ill patients with multiple symptoms. Front Behav Neurosci. 2018;12:93.
  • Treger I, Faran S, Ring H. Robot-assisted therapy for neuromuscular training of Sub-acute stroke patients. A Feasibility Study. Europ J Phys Rehabil Med. 2008;44:431–435.
  • Turchetti G, Mazzoleni S, Dario P, et al. The impact of robotic technology on neuro-rehabilitation: preliminary results on acceptability and effectiveness. Value Health. 2015;18(7):A363–A364.
  • Gunn M, Shank TM, Eppes M, et al. User evaluation of a dynamic arm orthosis for people with neuromuscular disorders. IEEE Trans Neural Syst Rehabil Eng. 2016;24(12):1277–1283.
  • Li W, Lam-Damji S, Chau T, et al. The development of a home-based virtual reality therapy system to promote upper extremity movement for children with hemiplegic cerebral palsy. TAD. 2009;21(3):107–113.
  • Ling L, Xiang C, Zhiyuan L, et al. Development of an EMG-ACC-Based upper limb rehabilitation training system. IEEE Transac Neural Syst Rehabil Engineer. 2017;25:244–253.
  • Winkels DG, Kottink AI, Temmink RA, et al. Wii™-habilitation of upper extremity function in children with cerebral palsy. An explorative study. Dev Neurorehabil. 2013;16(1):44–51.
  • Al-Amri M. A virtual reality based gait rehabilitation system for children with cerebral palsy. Dissertation Abstracts International: section B: the Sciences and Engineering. 2020.
  • Altun GP, Acar G. Effects of virtual reality games on upper extremity functions in children with cerebral palsy. Fizyoterapi Rehabilitasyon. 2015;26:24–P5.
  • Chrif F, van Hedel HJA, Vivian M, et al. Usability evaluation of an interactive leg press training robot for children with neuromuscular impairments. THC. 2022;30(5):1183–1197.
  • Caceres FJ, Saladino ML, Scaffa ME, et al. Neuro rehabilitation effectiveness based on virtual reality and tele rehabilitation in patients with multiple sclerosis in Argentina: ‘Reavitelem’ Study. Multiple Scler J. 2019;25:352–353.
  • Vanbeylen A, Kos D, Meurrens T, et al. Use of virtual reality glasses during physical therapy in pwMS: a feasibility study. Mult Scler J. 2021;27:15–16.
  • Antoine V, Sarah D, Maelle C, et al. The expectations and experience of PwMS after walking with the EksoGT. Mult Scler J. 2020;26:61.
  • Khalil H, Al-Sharman A, El-Salem K, et al. The development and pilot evaluation of virtual reality balance scenarios in people with multiple sclerosis (MS): a feasibility study. NeuroRehabilitation. 2018;43(4):473–482.
  • Shaw M, Palmeri M, Ladensack D, et al. Proceedings #9: immersive virtual reality rehabilitation for patients with multiple sclerosis. Brain Stimulation. 2019;12(2):e68–e9.
  • Stough D, Bethoux F, Greenberg B, et al. Physical therapy enhanced with a virtual environment: impact on ambulation, mood, and cognition. Arch Phys Med Rehabil. 2016;97(10):e25.
  • Birch N, Graham J, Priestley T. RAPPER II-Robot-Assisted PhysiotheraPy exercises with REX powered walking aid in patients with spinal cord injury. Spine J. 2016;16(4):S70.
  • Jackowski A, Gebhard M, Thietje R. Head motion and head Gesture-Based robot control: a usability study. IEEE Trans Neural Syst Rehabil Eng. 2018;26(1):161–170.
  • Jezernik S, Scharer R, Colombo G, et al. Adaptive robotic rehabilitation of locomotion: a clinical study in spinally injured individuals. Spinal Cord. 2003;41(12):657–666.
  • Jiryaei Z, Alvar AA, Bani MA, et al. Development and feasibility of a soft pneumatic-robotic glove to assist impaired hand function in quadriplegia patients: a pilot study. J Bodyw Mov Ther. 2021;27:731–736.
  • Corbianco S, Cavallini G, Dini M, et al. Energy cost and psychological impact of robotic-assisted gait training in people with spinal cord injury: effect of two different types of devices. Neurol Sci. 2021;42(8):3357–3366.
  • Gagnon DH, Vermette M, Duclos C, et al. Satisfaction and perceptions of long-term manual wheelchair users with a spinal cord injury upon completion of a locomotor training program with an overground robotic exoskeleton. Disabil Rehabil. 2019;14:138–145.
  • Banz R, Bolliger M, Colombo G, et al. Computerized visual feedback: an adjunct to Robotic-Assisted gait training. Phys Ther. 2008;88(10):1135–1145.
  • Koljonen PA, Virk AS, Jeong Y, et al. Outcomes of a multicenter safety and efficacy study of the SuitX phoenix powered exoskeleton for ambulation by patients with spinal cord injury. Front Neurol. 2021;12:689751.
  • Sale P, Russo EF, Scarton A, et al. Training for mobility with exoskeleton robot in spinal cord injury patients: a pilot study. Eur J Phys Rehabil Med. 2018;54(5):745–751.
  • Eliav R, Rand D, Schwartz Y, et al. Training with adaptive body-controlled virtual reality following acquired brain injury for improving executive functions. Brain Injury. 2017;31:857–858.
  • Nuic D, Vinti M, Karachi C, et al. The feasibility and positive effects of a customised videogame rehabilitation programme for freezing of gait and falls in parkinson’s disease patients: a pilot study. J Neuroeng Rehab. 2018;15(1):31.
  • Olivieri I, Chiappedi M, Meriggi P, et al. Rehabilitation of children with hemiparesis: a pilot study on the use of virtual reality. Biomed Res Int. 2013;2013:695935.
  • Verier A, Drean D, Bar A, et al. Acceptability of a mobile assistance robot with a manipulating arm: evaluation in a home setting with a population of 32 functional tetraplegic patients. Ann Phys Rehabil Med. 2010;53:e22–e3.
  • Iosa M, Aydin M, Candelise C, et al. The michelangelo effect: art improves the performance in a virtual reality task developed for upper limb neurorehabilitation. Front Psychol. 2020;11:611956.
  • Meca-Lallana V, Prefasi D, Alabarcez W, et al. A pilot study to explore patient satisfaction with a virtual rehabilitation program in multiple sclerosis: the RehabVR study protocol. Front Neurol. 2020;11:900.
  • Lozano-Quilis JA, Gil-Gómez H, Gil-Gómez JA, et al. Virtual rehabilitation for multiple sclerosis using a kinect-based system: randomized controlled trial. JMIR Serious Games. 2014;2(2):e12.
  • Guillen-Climent S, Garzo A, Munoz-Alcaraz MN, et al. A usability study in patients with stroke using MERLIN, a robotic system based on serious games for upper limb rehabilitation in the home setting. J Neuroeng Rehab. 2021;18(1):41.
  • Campo-Prieto P, Rodríguez-Fuentes G, Cancela-Carral JM. Can immersive virtual reality videogames help parkinson’s disease patients? A case study. Sensors. 2021;21(14):4825.
  • Van Beek JJ, Van Wegen EE, Bohlhalter S, et al. Exergaming-based dexterity training in persons with parkinson disease: a pilot feasibility study. J Neurol Phys Ther. 2019;43(3):168–174.
  • Ciullo AS, Veerbeek JM, Temperli E, et al. A novel soft robotic supernumerary hand for severely affected stroke patients. IEEE Trans Neural Syst Rehabil Eng. 2020;28(5):1168–1177.
  • Lledo LD, Diez JA, Bertomeu-Motos A, et al. A comparative analysis of 2D and 3D tasks for virtual reality therapies based on Robotic-Assisted neurorehabilitation for post-stroke patients. Front Aging Neurosci. 2016;8:205.
  • Neil A, Ens S, Pelletier R, et al. Sony PlayStation EyeToy elicits higher levels of movement than the nintendo wii: implications for stroke rehabilitation. Europ J Phys Rehabil Med. 2013;49:13–21.
  • Weber LM, Nilsen DM, Gillen G, et al. Immersive virtual reality mirror therapy for upper limb recovery after stroke: a pilot study. Am J Phys Med Rehabil. 2019;98(9):783–788.
  • Vanbellingen T, Filius SJ, Nyffeler T, et al. Usability of Videogame-Based dexterity training in the early rehabilitation phase of stroke patients: a pilot study. Front Neurol. 2017;8:654.
  • Singh N, Saini M, Anand S, et al. Robotic exoskeleton for wrist and fingers joint in Post-Stroke Neuro-Rehabilitation for Low-Resource settings. IEEE Trans Neural Syst Rehabil Eng. 2019;27(12):2369–2377.
  • Proffitt R, Lange B, Sevick M. Abstract W P135: usability of a virtual reality tool for in-Home stroke rehabilitation: a case series. Stroke. 2015;46(suppl_1):AWP135–AWP.
  • Lambelet C, Temiraliuly D, Siegenthaler M, et al. Characterization and wearability evaluation of a fully portable wrist exoskeleton for unsupervised training after stroke. J Neuroeng Rehab. 2020;17(1):132.
  • Kizony R, Weiss PL, Shahar M, et al. TheraGame: a home based virtual reality rehabilitation system. Inter J Disabil Hum Develop. 2006;5(3):265–269.
  • Tamplin J, Loveridge B, Clarke K, et al. Development and feasibility testing of an online virtual reality platform for delivering therapeutic group singing interventions for people living with spinal cord injury. J Telemed Telecare. 2020;26(6):365–375.
  • Meldrum D, Glennon A, Herdman S, et al. Virtual reality rehabilitation of balance: assessment of the usability of the nintendo wii(®) fit Plus. Disabil Rehabil Assist Technol. 2012;7(3):205–210.
  • Bethoux F, Stough D, Sutliff M, et al. Pilot study of powered exoskeleton use for gait rehabilitation in individuals with multiple sclerosis: feasibility, safety, and effects on gait parameters. Arch Phys Med Rehabil. 2020;101(11):e70.
  • Gomez-Vargas D, Ballen-Moreno F, Barria P, et al. The actuation system of the ankle exoskeleton t-flex: first use experimental validation in people with stroke. Brain Sci. 2021;11(4):412.
  • Kilbride C, Scott DJM, Butcher T, et al. Safety, feasibility, acceptability and preliminary effects of the neurofenix platform for rehabilitation via HOMe based gaming exercise for the upper-limb post stroke (RHOMBUS): results of a feasibility intervention study. BMJ Open. 2022;12(2):e052555.
  • Sevit R, Sels R, Bukenbergs S, et al. ARTHE: development of an upper limb active smart wearable orthosis for stroke therapy. Technol Disabil. 2019;31:S75–S6.
  • Demers M, Chan Chun Kong D, Levin MF. Feasibility of incorporating functionally relevant virtual rehabilitation in Sub-acute stroke care: perception of patients and clinicians. Disabil Rehabil Assist Technol. 2019;14(4):361–367.
  • Shahar N, Schwartz I, Portnoy S. Differences in muscle activity and fatigue of the upper limb between Task-Specific training and robot assisted training among individuals post stroke. J Biomech. 2019;89:28–33.
  • Lloréns R, Gil-Gómez JA, Alcañiz M, et al. Improvement in balance using a virtual reality-based stepping exercise: a randomized controlled trial involving individuals with chronic stroke. Clin Rehabil. 2015;29(3):261–268.
  • Llorens R, Colomer-Font C, Alcaniz M, et al. BioTrak virtual reality system: effectiveness and satisfaction analysis for balance rehabilitation in patients with brain injury. Neurologia. 2013;28(5):268–275.
  • Swinnen E, Lefeber N, Willaert W, et al. Motivation, expectations, and usability of a driven gait orthosis in stroke patients and their therapists. Top Stroke Rehabil. 2017;24(4):299–308.
  • Gorsic M, Cikajlo I, Goljar N, et al. A multisession evaluation of an adaptive competitive arm rehabilitation game. J Neuroeng Rehab. 2017;14(1):128.
  • Boone AE, Wolf TJ, Engsberg JR. Combining virtual reality motor rehabilitation with cognitive strategy use in chronic stroke. Am J Occup Ther. 2019;73(4):7304345020p1–7304345020p9.
  • Held JP, Ferrer B, Mainetti R, et al. Autonomous rehabilitation at stroke patients home for balance and gait: safety, usability and compliance of a virtual reality system. Eur J Phys Rehabil Med. 2018;54(4):545–553.
  • Tobler-Ammann BC, Surer E, Knols RH, et al. User perspectives on exergames designed to explore the hemineglected space for stroke patients with visuospatial neglect: usability study. JMIR Serious Games. 2017;5(3):e18.
  • Archambault P, Norouzi-Gheidari N, Tao G, et al. Use of exergames for upper extremity rehabilitation in stroke patients. Ann Phys Rehabil Med. 2015;58:e97–e8.
  • O’Shea R, Broderick M, Bentley P. Adoption of rehabilitation technologies depends on technology usability and acceptability amongst stroke survivors. Europ Stroke J. 2021;6:295.
  • Buccellato KH, Nordstrom M, Murphy JM, et al. A randomized feasibility trial of a novel, integrative, and intensive virtual rehabilitation program for service members Post-Acquired brain injury. Mil Med. 2020;185(1–2):e203–e11.
  • Carpinella I, Cattaneo D, Bonora G, et al. Wearable Sensor-Based biofeedback training for balance and gait in parkinson disease: a pilot randomized controlled trial. Arch Phys Med Rehabil. 2017;98(4):622–630.e3.
  • Crosbie JH, Lennon S, McNeill MDJ, et al. Virtual reality in the rehabilitation of the upper limb after stroke: the user’s perspective. Cyberpsychol Behav. 2006;9(2):137–141.
  • Lewis GN, Woods C, Rosie JA, et al. Virtual reality games for rehabilitation of people with stroke: perspectives from the users. Disabil Rehabil Assist Technol. 2011;6(5):453–463.
  • Lupinacci G, Gatti G, Melegari C, et al. Interactive design of patient-oriented video-games for rehabilitation: concept and application. Disabil Rehabil Assist Technol. 2018;13(3):234–244.
  • Carifio J, Perla R. Resolving the 50-year debate around using and misusing likert scales. Med Educ. 2008;42(12):1150–1152.
  • Dolnicar S. 5/7-point “likert scales” aren’t always the best option: their validity is undermined by lack of reliability, response style bias, long completion times and limitations to permissible statistical procedures. Ann Tourism Res. 2021;91:103297.
  • Bangor A, Kortum PT, Miller JT. An empirical evaluation of the system usability scale. Inter J Hum Comp Interact. 2008;24(6):574–594.
  • Lewis JR. The system usability scale: past, present, and future. Inter J Hum Comp Interact. 2018;34(7):577–590.
  • Folador JP, Souza Santos MC, Oliveira Andrade A. System usability scale (SUS) applied to a web-based integrated system for the management of data from people with Parkinson’s disease. In: XII Simpósio De Engenharia Biomédica - IX Simpósio De Instrumentação E Imagens Médicas, Uberlândia, Brazil. 2019.
  • Demers L, Weiss-Lambrou R, Ska B. Item analysis of the Quebec user evaluation of satisfaction with assistive technology (QUEST). Assist Technol. 2000;12(2):96–105.
  • Demers L, Monette M, Lapierre Y, et al. Reliability, validity, and applicability of the Quebec user evaluation of satisfaction with assistive technology (QUEST 2.0) for adults with multiple sclerosis. Disabil Rehabil. 2002;24(1–3):21–30.
  • Demers L, Weiss-Lambrou R, Ska B. Development of the Quebec user evaluation of satisfaction with assistive technology (QUEST). Assist Technol. 1996;8(1):3–13.
  • Atigossou OLG, Honado AS, Routhier F, et al. Psychometric properties of the psychosocial impact of assistive devices scale (PIADS): a systematic review. Assist Technol. 2022.
  • Gao M, Kortum P, Oswald F. Psychometric evaluation of the USE (usefulness, satisfaction, and ease of use) questionnaire for reliability and validity. Proc Hum Factors Ergon Soc Ann Meeting. 2018;62(1):1414–1418.
  • Morgan DG, Kosteniuk J, Stewart N, et al. The telehealth satisfaction scale: reliability, validity, and satisfaction with telehealth in a rural memory clinic population. Telemed J E Health. 2014;20(11):997–1003.
  • Xiao YM, Wang ZM, Wang MZ, et al. The appraisal of reliability and validity of subjective workload assessment technique and NASA-task load index. Zhonghua Lao Dong Wei Sheng Zhi Ye Bing Za Zhi [Chinese Journal of Industrial Hygiene and Occupational Diseases]. 2005;23:178–181.
  • Devos H, Gustafson K, Ahmadnezhad P, et al. Psychometric properties of NASA-TLX and index of cognitive activity as measures of cognitive workload in older adults. Brain Sci. 2020.
  • Larsen DL, Attkisson CC, Hargreaves WA, et al. Assessment of client/patient satisfaction: development of a general scale. Eval Program Plann. 1979;2(3):197–207.
  • Kelly PJ, Kyngdon F, Ingram I, et al. The client satisfaction questionnaire-8: psychometric properties in a cross-sectional survey of people attending residential substance abuse treatment. Drug Alcohol Rev. 2018;37(1):79–86.