Research Article

The impact of visual and motor space size on gaze-based target selection

Article: 2309384 | Received 25 Jul 2023, Accepted 16 Jan 2024, Published online: 05 Feb 2024

References

  • Agustin, J. S., Mateo, J. C., Hansen, J. P., & Villanueva, A. (2009). Evaluation of the potential of gaze input for game interaction. PsychNology Journal, 7(2), 213–217.
  • Aoki, H., Hansen, J. P., & Itoh, K. (2008). Learning to interact with a computer by gaze. Behaviour & Information Technology, 27(4), 339–344. https://doi.org/10.1080/01449290701760641
  • Bates, R., & Istance, H. (2002, July). Zooming interfaces! Enhancing the performance of eye controlled pointing devices. Proceedings of the Fifth International ACM Conference on Assistive Technologies (pp. 119–126). Association for Computing Machinery. https://doi.org/10.1145/638249.638272
  • Bates, D., Mächler, M., Bolker, B., & Walker, S. (2015). Fitting linear mixed-effects models using lme4. Journal of Statistical Software, 67(1), 1–48. https://doi.org/10.18637/jss.v067.i01
  • Bieg, H. J., Chuang, L. L., Fleming, R. W., Reiterer, H., & Bülthoff, H. H. (2010, March). Eye and pointer coordination in search and selection tasks. Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications (pp. 89–92). Association for Computing Machinery. https://doi.org/10.1145/1743666.1743688
  • Borji, A., Lennartz, A., & Pomplun, M. (2015). What do eyes reveal about the mind?: Algorithmic inference of search targets from fixations. Neurocomputing, 149, 788–799. https://doi.org/10.1016/j.neucom.2014.07.055
  • Canare, D., Chaparro, B., & He, J. (2015, August). A comparison of gaze-based and gesture-based input for a point-and-click task. International Conference on Universal Access in Human-Computer Interaction (pp. 15–24). Springer, Cham. https://doi.org/10.1007/978-3-319-20681-3_2
  • Choi, M., Sakamoto, D., & Ono, T. (2020, June). Bubble gaze cursor + bubble gaze lens: Applying area cursor technique to eye-gaze interface. In ACM Symposium on Eye Tracking Research and Applications (pp. 1–10). Association for Computing Machinery. https://doi.org/10.1145/3379155.3391322
  • Chun-Yan, K., Zhi-Hao, L., Shao-Yao, Z., Hong-Ting, L., Chun-Hui, W., & Tian, Y. (2020). Influence of target size and distance to target on performance of multiple pointing techniques. Space Medicine & Medical Engineering, 33(2), 112–119. https://doi.org/10.16289/j.cnki.1002-0837.2020.02.004
  • Cockburn, A., & Brock, P. (2006). Human on-line response to visual and motor target expansion. Proceedings of the Graphics Interface 2006, Quebec City, Canada (pp. 81–87). Canadian Information Processing Society.
  • Delamare, W., Janssoone, T., Coutrix, C., & Nigay, L. (2016, June). Designing 3D gesture guidance: Visual feedback and feedforward design options. Proceedings of the International Working Conference on Advanced Visual Interfaces (pp. 152–159). Association for Computing Machinery. https://doi.org/10.1145/2909132.2909260
  • Desmurget, M., & Grafton, S. (2000). Forward modeling allows feedback control for fast reaching movements. Trends in Cognitive Sciences, 4(11), 423–431. https://doi.org/10.1016/S1364-6613(00)01537-0
  • Djajadiningrat, T., Overbeeke, K., & Wensveen, S. (2002, June). But how, Donald, tell us how? On the creation of meaning in interaction design through feedforward and inherent feedback. Proceedings of the 4th Conference on Designing Interactive Systems: Processes, Practices, Methods, and Techniques (pp. 285–291). Association for Computing Machinery. https://doi.org/10.1145/778712.778752
  • Elliott, D., Dutoy, C., Andrew, M., Burkitt, J. J., Grierson, L. E., Lyons, J. L., & Bennett, S. J. (2014). The influence of visual feedback and prior knowledge about feedback on vertical aiming strategies. Journal of Motor Behavior, 46(6), 433–443. https://doi.org/10.1080/00222895.2014.933767
  • Epelboim, J., Steinman, R. M., Kowler, E., Edwards, M., Pizlo, Z., Erkelens, C. J., & Collewijn, H. (1995). The function of visual search and memory in sequential looking tasks. Vision Research, 35(23–24), 3401–3422. https://doi.org/10.1016/0042-6989(95)00080-X
  • Feng, C.-Z., Shen, M.-W., & Su, H. (2003). Spatial and temporal characteristic of eye movement in human-computer interface design. Space Medicine & Medical Engineering, 16(4), 304–306.
  • Guillon, M., Leitner, F., & Nigay, L. (2014, May). Static voronoi-based target expansion technique for distant pointing. Proceedings of the 2014 International Working Conference on Advanced Visual Interfaces (pp. 41–48). Association for Computing Machinery. https://doi.org/10.1145/2598153.2598178
  • Guillon, M., Leitner, F., & Nigay, L. (2015). Investigating Visual Feedforward for Target Expansion Techniques. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems - CHI ’15 (pp. 2777–2786). Association for Computing Machinery. https://doi.org/10.1145/2702123.2702375
  • Gutwin, C. (2002, April). Improving focus targeting in interactive fisheye views. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 267–274). Association for Computing Machinery. https://doi.org/10.1145/503376.503424
  • Haith, A. M., & Krakauer, J. W. (2018). The multiple effects of practice: Skill, habit and reduced cognitive load. Current Opinion in Behavioral Sciences, 20, 196–201. https://doi.org/10.1016/j.cobeha.2018.01.015
  • Hansen, D. W., Skovsgaard, H. H., Hansen, J. P., & Møllenbach, E. (2008, March). Noise tolerant selection by gaze-controlled pan and zoom in 3D. Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (pp. 205–212). Association for Computing Machinery. https://doi.org/10.1145/1344471.1344521
  • Hardwick, R. M., Forrence, A. D., Krakauer, J. W., & Haith, A. M. (2019). Time-dependent competition between goal-directed and habitual response preparation. Nature Human Behaviour, 3(12), 1252–1262. https://doi.org/10.1038/s41562-019-0725-0
  • Hart, S. G., & Staveland, L. E. (1988). Development of NASA-TLX (Task Load Index): Results of empirical and theoretical research. In Advances in psychology (Vol. 52, pp. 139–183). North-Holland. https://doi.org/10.1016/S0166-4115(08)62386-9
  • Hayhoe, M., & Ballard, D. (2005). Eye movements in natural behavior. Trends in Cognitive Sciences, 9(4), 188–194. https://doi.org/10.1016/j.tics.2005.02.009
  • Hyrskykari, A., Istance, H., & Vickers, S. (2012, March). Gaze gestures or dwell-based interaction? Proceedings of the Symposium on Eye Tracking Research and Applications (pp. 229–232). Association for Computing Machinery. https://doi.org/10.1145/2168556.2168602
  • Kleinke, C. L. (1986). Gaze and eye contact: A research review. Psychological Bulletin, 100(1), 78. https://doi.org/10.1037/0033-2909.100.1.78
  • Kumar, C., Menges, R., Müller, D., & Staab, S. (2017, April). Chromium based framework to include gaze interaction in web browser. Proceedings of the 26th International Conference on World Wide Web Companion (pp. 219–223). International World Wide Web Conferences Steering Committee. https://doi.org/10.1145/3041021.3054730
  • Kumar, M., Paepcke, A., & Winograd, T. (2007, April). EyePoint: Practical pointing and selection using gaze and keyboard. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 421–430). Association for Computing Machinery. https://doi.org/10.1145/1240624.1240692
  • MacKenzie, I. S. (2018). Fitts’ law. In K. L. Norman & J. Kirakowski (Eds.), Handbook of human-computer interaction (pp. 349–370). Wiley. https://doi.org/10.1002/9781118976005.ch17
  • McGuffin, M. J., & Balakrishnan, R. (2005). Fitts’ law and expanding targets: Experimental studies and designs for user interfaces. ACM Transactions on Computer-Human Interaction (TOCHI), 12(4), 388–422. https://doi.org/10.1145/1121112.1121115
  • Miniotas, D., Špakov, O., & MacKenzie, I. S. (2004, April). Eye gaze interaction with expanding targets. CHI ’04 Extended Abstracts on Human Factors in Computing Systems (pp. 1255–1258). Association for Computing Machinery. https://doi.org/10.1145/985921.986037
  • Miyoshi, T., & Murata, A. (2001, October). Usability of input device using eye tracker on button size, distance between targets and direction of movement. In 2001 IEEE International Conference on Systems, Man and Cybernetics. e-Systems and e-Man for Cybernetics in Cyberspace (Cat. No. 01CH37236) (Vol. 1, pp. 227–232). IEEE. https://doi.org/10.1109/ICSMC.2001.969816
  • Morimoto, C. H., Leyva, J. A. T., & Diaz-Tula, A. (2018). Context switching eye typing using dynamic expanding targets. Proceedings of the Workshop on Communication by Gaze Interaction - COGAIN ’18. Association for Computing Machinery. https://doi.org/10.1145/3206343.3206347
  • Munoz, D. P., & Everling, S. (2004). Look away: The anti-saccade task and the voluntary control of eye movement. Nature Reviews Neuroscience, 5(3), 218–228. https://doi.org/10.1038/nrn1345
  • Peternel, L., Sigaud, O., & Babič, J. (2017). Unifying speed-accuracy trade-off and cost-benefit trade-off in human reaching movements. Frontiers in Human Neuroscience, 11, 615. https://doi.org/10.3389/fnhum.2017.00615
  • Qvarfordt, P. (2017). Gaze-informed multimodal interaction. In The handbook of multimodal-multisensor interfaces: Foundations, user modeling, and common modality combinations – Volume 1 (pp. 365–402). Association for Computing Machinery. https://doi.org/10.1145/3015783.3015794
  • Rantala, J., Majaranta, P., Kangas, J., Isokoski, P., Akkil, D., Špakov, O., & Raisamo, R. (2020). Gaze interaction with vibrotactile feedback: Review and design guidelines. Human–Computer Interaction, 35(1), 1–39. https://doi.org/10.1080/07370024.2017.1306444
  • Rayner, K. (1998). Eye movements in reading and information processing: 20 years of research. Psychological Bulletin, 124(3), 372–422. https://doi.org/10.1037/0033-2909.124.3.372
  • Reichenthal, M., Avraham, G., Karniel, A., & Shmuelof, L. (2016). Target size matters: Target errors contribute to the generalization of implicit visuomotor learning. Journal of Neurophysiology, 116(2), 411–424. https://doi.org/10.1152/jn.00830.2015
  • Seidler, R. D., Noll, D. C., & Thiers, G. (2004). Feedforward and feedback processes in motor control. NeuroImage, 22(4), 1775–1783. https://doi.org/10.1016/j.neuroimage.2004.05.003
  • Sibert, L. E., Templeman, J. N., & Jacob, R. J. (2001). Evaluation and analysis of eye gaze interaction (No. NRL/FR/5513–01-9990). Naval Research Laboratory.
  • Souto, D., Marsh, O., Hutchinson, C., Judge, S., & Paterson, K. B. (2021). Cognitive plasticity induced by gaze-control technology: Gaze-typing improves performance in the antisaccade task. Computers in Human Behavior, 122, 106831. https://doi.org/10.1016/j.chb.2021.106831
  • Špakov, O., & Miniotas, D. (2005, October). Gaze-based selection of standard-size menu items. Proceedings of the 7th International Conference on Multimodal Interfaces (pp. 124–128). Association for Computing Machinery. https://doi.org/10.1145/1088463.1088486
  • Squire, P., Mead, P., Smith, M., Coons, R., & Popola, A. (2013). Quick-eye: Examination of human performance characteristics using eye tracking and manual-based control systems for monitoring multiple displays. Naval Surface Warfare Center.
  • Stellmach, S., & Dachselt, R. (2013, April). Still looking: Investigating seamless gaze-supported selection, positioning, and manipulation of distant targets. Proceedings of the Sigchi Conference on Human Factors in Computing Systems (pp. 285–294). Association for Computing Machinery. https://doi.org/10.1145/2470654.2470695
  • Su, X., Au, O. K. C., & Lau, R. W. (2014, April). The implicit fan cursor: A velocity dependent area cursor. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 753–762). Association for Computing Machinery. https://doi.org/10.1145/2556288.2557095
  • Usuba, H., Yamanaka, S., & Miyashita, H. (2018). Pointing to targets with difference between motor and visual widths. Proceedings of the 30th Australian Conference on Computer-Human Interaction (pp. 374–383). Association for Computing Machinery. https://doi.org/10.1145/3292147.3292150
  • Usuba, H., Yamanaka, S., & Miyashita, H. (2020, December). A model for pointing at targets with different clickable and visual widths and with distractors. 32nd Australian Conference on Human-Computer Interaction (pp. 1–10). Association for Computing Machinery. https://doi.org/10.1145/3441000.3441019
  • Vermeulen, J., Luyten, K., van den Hoven, E., & Coninx, K. (2013, April). Crossing the bridge over Norman’s gulf of execution: Revealing feedforward’s true identity. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 1931–1940). Association for Computing Machinery. https://doi.org/10.1145/2470654.2466255
  • Vertegaal, R. (2008, October). A Fitts law comparison of eye tracking and manual input in the selection of visual targets. Proceedings of the 10th International Conference on Multimodal Interfaces (pp. 241–248). Association for Computing Machinery. https://doi.org/10.1145/1452392.1452443
  • Zhai, S., Conversy, S., Beaudouin-Lafon, M., & Guiard, Y. (2003, April). Human on-line response to target expansion. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 177–184). Association for Computing Machinery. https://doi.org/10.1145/642611.642644
  • Zhang, X., & MacKenzie, I. S. (2007). Evaluating eye tracking with ISO 9241 – Part 9. In Human-Computer Interaction. HCI Intelligent Multimodal Interaction Environments: 12th International Conference, HCI International 2007, Beijing, China, July 22–27, 2007, Proceedings, Part III (pp. 779–788). Springer.