
A robust, real-time camera-based eye gaze tracking system to analyze users’ visual attention using deep learning

Pages 409-430 | Received 11 Nov 2021, Accepted 04 Jun 2022, Published online: 22 Jun 2022

References

  • Brousseau, B., Rose, J., & Eizenman, M. (2020). Hybrid eye-tracking on a smartphone with CNN feature extraction and an infrared 3D model. Sensors, 20(2), 543. https://doi.org/10.3390/s20020543
  • Burch, M., Jalba, A., & den Hollander, C. D. (2021). Convolutional neural networks for real-time eye tracking in interactive applications. In Bryan Christiansen & Tihana Skrinjaric (Eds.), Handbook of research on Applied AI for international business and marketing applications (pp. 455–473). IGI Global.
  • Chen, Z., & Shi, B. E. (2018). Appearance-based gaze estimation using dilated-convolutions. Asian Conference on Computer Vision, 11366 LNCS, December 2-6, 309–324. https://doi.org/10.1007/978-3-030-20876-9_20
  • Cheng, Y., Huang, S., Wang, F., Qian, C., & Lu, F. (2020). A coarse-to-fine adaptive network for appearance-based gaze estimation. Proceedings of the AAAI Conference on Artificial Intelligence, February 7-20, 10623–10630.
  • Cheng, Y., Lu, F., & Zhang, X. (2018). Appearance-based gaze estimation via evaluation-guided asymmetric regression. Proceedings of the European Conference on Computer Vision (ECCV), September 8-14, 100–115.
  • Cheng, Y., Zhang, X., Lu, F., & Sato, Y. (2020). Gaze estimation by exploring two-eye asymmetry. IEEE Transactions on Image Processing, 29(1), 5259–5272. https://doi.org/10.1109/TIP.2020.2982828
  • Choe, K. W., Blake, R., & Lee, S. H. (2016). Pupil size dynamics during fixation impact the accuracy and precision of video-based gaze estimation. Vision Research. https://doi.org/10.1016/j.visres.2014.12.018
  • Dimpfel, W. (2015). Neuromarketing: Neurocode-tracking in combination with eye-tracking for quantitative objective assessment of TV commercials. Journal of Behavioral and Brain Science, 5(4), 137–147. https://doi.org/10.4236/jbbs.2015.54014
  • Drakopoulos, P., Koulieris, G., & Mania, K. (2021). Eye tracking interaction on unmodified mobile VR headsets using the selfie camera. ACM Transactions on Applied Perception, 18(3), 1–20. https://doi.org/10.1145/3456875
  • Farnsworth, B. (2019). Eye tracker prices. https://imotions.com/blog/eye-tracker-prices/
  • Fischer, T., Chang, H. J., & Demiris, Y. (2018). RT-GENE: Real-time eye gaze estimation in natural environments. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 11214 LNCS, 339–357. https://doi.org/10.1007/978-3-030-01249-6_21
  • George, A., & Routray, A. (2016a). Fast and accurate algorithm for eye localization for gaze tracking in low resolution images. IET Computer Vision, 1–12. https://doi.org/10.1049/iet-cvi.2015.0316
  • George, A., & Routray, A. (2016b). Real-time eye gaze direction classification using convolutional neural network. 2016 International Conference on Signal Processing and Communications (SPCOM), June 12-15, 1–5.
  • Guestrin, E. D., & Eizenman, M. (2006). General theory of remote gaze estimation using the pupil center and corneal reflections. IEEE Transactions on Biomedical Engineering, 53(6), 1124–1133. https://doi.org/10.1109/TBME.2005.863952
  • Hansen, D. W., & Ji, Q. (2010). In the eye of the beholder: A survey of models for eyes and gaze. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(3), 478–500. https://doi.org/10.1109/TPAMI.2009.30
  • Hornof, A., Cavender, A., & Hoselton, R. (2003). Eyedraw: A system for drawing pictures with eye movements. ACM SIGACCESS Accessibility and Computing, 86–93. https://doi.org/10.1145/1029014.1028647
  • Huang, Q., Veeraraghavan, A., & Sabharwal, A. (2017). Tabletgaze: Dataset and analysis for unconstrained appearance-based gaze estimation in mobile tablets. Machine Vision and Applications, 28(5–6), 445–461. https://doi.org/10.1007/s00138-017-0852-4
  • Ince, I. F., & Kim, J. W. (2011). A 2D eye gaze estimation system with low-resolution webcam images. EURASIP Journal on Advances in Signal Processing, 40(1), 1–11. https://doi.org/10.1186/1687-6180-2011-40
  • Jafari, R., & Ziou, D. (2015). Eye-gaze estimation under various head positions and iris states. Expert Systems with Applications, 42(1), 510–518. https://doi.org/10.1016/j.eswa.2014.08.003
  • Jiang, J., Guo, F., Chen, J., Tian, X., & Lv, W. (2019). Applying eye-tracking technology to measure interactive experience toward the navigation interface of mobile games considering different visual attention. Applied Sciences, 9(16), 3242. https://doi.org/10.3390/app9163242
  • Kanade, P., David, F., & Kanade, S. (2021). Convolutional neural networks (CNN) based eye-gaze tracking system using machine learning algorithm. European Journal of Electrical Engineering and Computer Science, 5(2), 36–40. https://doi.org/10.24018/ejece.2021.5.2.314
  • Kang, Z., & Landry, S. J. (2015). An eye movement analysis algorithm for a multielement target tracking task: Maximum transition-based agglomerative hierarchical clustering. IEEE Transactions on Human-Machine Systems, 45(1), 13–24. https://doi.org/10.1109/THMS.2014.2363121
  • Kowalczyk, P., & Sawicki, D. (2019). Blink and wink detection as a control tool in multimodal interaction. Multimedia Tools and Applications, 78(10), 13749–13765. https://doi.org/10.1007/s11042-018-6554-8
  • Krafka, K., Khosla, A., Kellnhofer, P., Kannan, H., Bhandarkar, S., Matusik, W., & Torralba, A. (2016). Eye tracking for everyone. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2176–2184. https://doi.org/10.1109/CVPR.2016.239
  • Kruthiventi, S. S. S., Ayush, K., & Babu, R. V. (2017). Deepfix: A fully convolutional neural network for predicting human eye fixations. IEEE Transactions on Image Processing. https://doi.org/10.1109/TIP.2017.2710620
  • Kumar, D., & Sharma, A. (2016). Electrooculogram-based virtual reality game control using blink detection and gaze calibration. 2016 International Conference on Advances in Computing, Communications and Informatics, ICACCI 2016, 2358–2362. https://doi.org/10.1109/ICACCI.2016.7732407
  • Kurilovas, E., & Kubilinskiene, S. (2020). Lithuanian case study on evaluating suitability, acceptance and use of IT tools by students–An example of applying technology enhanced learning research methods in higher education. Computers in Human Behavior, 107, 106274. https://doi.org/10.1016/j.chb.2020.106274
  • Lian, D., Hu, L., Luo, W., Xu, Y., & Duan, L. (2018). Multiview multitask gaze estimation with deep convolutional neural networks. IEEE Transactions on Neural Networks and Learning Systems, 30(10), 3010–3023. https://doi.org/10.1109/TNNLS.2018.2865525
  • Liu, S. S., Rawicz, A., Ma, T., Zhang, C., Lin, K., Rezaei, S., & Wu, E. (2012). An eye-gaze tracking and human computer interface system for people with ALS and other locked-in diseases. Journal of Medical and Biological Engineering, 32(2), 111–116.
  • Liu, T. S. W., Liu, Y. T., & Chen, C. Y. D. (2019). Meaningfulness is in the eye of the reader: Eye-tracking insights of L2 learners reading e-books and their pedagogical implications. Interactive Learning Environments, 27(2), 181–199. https://doi.org/10.1080/10494820.2018.1451901
  • Lu, F., Okabe, T., Sugano, Y., & Sato, Y. (2014). Learning gaze biases with head motion for head pose-free gaze estimation. Image and Vision Computing, 32(3), 169–179. https://doi.org/10.1016/j.imavis.2014.01.005
  • Ma, C., Choi, K.-A., Choi, B.-D., & Ko, S.-J. (2015). Robust remote gaze estimation method based on multiple geometric transforms. Optical Engineering, 54(8), 83103. https://doi.org/10.1117/1.OE.54.8.083103
  • Mason, M. F., Hood, B. M., & Macrae, C. N. (2004). Look into my eyes: Gaze direction and person memory. Memory, 12(5), 637–643. https://doi.org/10.1080/09658210344000152
  • Mazhar, O., Shah, T. A., Khan, M. A., & Tehami, S. (2015). A real-time webcam based eye ball tracking system using MATLAB. 2015 IEEE 21st International Symposium for Design and Technology in Electronic Packaging, SIITME 2015, October, 139–142. https://doi.org/10.1109/SIITME.2015.7342312
  • Meena, Y. K., Cecotti, H., Wong-Lin, K., Dutta, A., & Prasad, G. (2018). Toward optimization of gaze-controlled human-computer interaction: Application to Hindi virtual keyboard for stroke patients. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 26(4), 911–922. https://doi.org/10.1109/TNSRE.2018.2814826
  • Mele, M. L., & Federici, S. (2012). Gaze and eye-tracking solutions for psychological research. Cognitive Processing, 13(1), 261–265. https://doi.org/10.1007/s10339-012-0499-z
  • Meng, C., & Zhao, X. (2017). Webcam-based eye movement analysis using CNN. IEEE Access, 5(1), 19581–19587. https://doi.org/10.1109/ACCESS.2017.2754299
  • Modi, N., & Singh, J. (2020). A survey of research trends in assistive technologies using information modelling techniques. Disability and Rehabilitation: Assistive Technology, 15(1), 1–19. https://doi.org/10.1080/17483107.2020.1817992
  • Modi, N., & Singh, J. (2021). A review of various state of art eye gaze estimation techniques. Advances in Computational Intelligence and Communication Technology, February 22, 501–510. https://doi.org/10.1007/978-981-15-1275-9_41
  • Mou, J., & Shin, D. (2018). Effects of social popularity and time scarcity on online consumer behaviour regarding smart healthcare products: An eye-tracking approach. Computers in Human Behavior, 78(1), 74–89. https://doi.org/10.1016/j.chb.2017.08.049
  • Pantic, M., Pentland, A., Nijholt, A., & Huang, T. S. (2007). Human computing and machine understanding of human behavior: A survey. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 4451(1), 47–71. https://doi.org/10.1007/978-3-540-72348-6_3
  • Pilarcikova, K., Rusnak, P., & Rabcan, J. (2019). User experience in the development of the education system. 17th International Conference on Emerging ELearning Technologies and Applications (ICETA), November 21-22, 626–632.
  • Raj, R., & Joseph, N. (2016). Keypoint extraction using SURF algorithm for CMFD. Procedia Computer Science, 93(1), 375–381. https://doi.org/10.1016/j.procs.2016.07.223
  • Rakoczi, G., & Pohl, M. (2012). Visualisation and analysis of multiuser gaze data: Eye tracking usability studies in the special context of e-learning. Proceedings of the 12th IEEE International Conference on Advanced Learning Technologies, ICALT 2012. https://doi.org/10.1109/ICALT.2012.15
  • Ranjan, R., De Mello, S., & Kautz, J. (2018). Light-weight head pose invariant gaze tracking. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, June 18-22, 2156–2164.
  • Rappa, N. A., Ledger, S., Teo, T., Wai Wong, K., Power, B., & Hilliard, B. (2019). The use of eye tracking technology to explore learning and performance within virtual reality and mixed reality settings: A scoping review. Interactive Learning Environments, 27(1), 1–13. https://doi.org/10.1080/10494820.2019.1702560
  • Ribes, M. T., & Pallejà Cabrè, T. (2019). Optical mouse sensor for eye blink detection and pupil tracking: Application in a low-cost eye-controlled pointing device. Journal of Sensors, 2019(1), 1–19. https://doi.org/10.1155/2019/3931713
  • Singh, J., & Modi, N. (2019). Use of information modelling techniques to understand research trends in eye gaze estimation methods: An automated review. Heliyon, 5(12), e03033. https://doi.org/10.1016/j.heliyon.2019.e03033
  • Skodras, E., Kanas, V. G., & Fakotakis, N. (2015). On visual gaze tracking based on a single low cost camera. Signal Processing: Image Communication, 36(1), 29–42. https://doi.org/10.1016/j.image.2015.05.007
  • Spiller, M., Liu, Y. H., Hossain, M. Z., Gedeon, T., Geissler, J., & Nürnberger, A. (2021). Predicting visual search task success from eye gaze data as a basis for user-adaptive information visualization systems. ACM Transactions on Interactive Intelligent Systems, 11(2), 1–25. https://doi.org/10.1145/3446638
  • Sulikowski, P., & Zdziebko, T. (2020). Deep learning-enhanced framework for performance evaluation of a recommending interface with varied recommendation position and intensity based on eye-tracking equipment data processing. Electronics, 9(2), 266. https://doi.org/10.3390/electronics9020266
  • Toreini, P., Langner, M., & Maedche, A. (2020). Using eye-tracking for visual attention feedback. In Fred D. Davis, René Riedl, Jan vom Brocke, Pierre- Majorique Léger, Adriane Randolph, & Thomas Fischer (Eds.), Information Systems and neuroscience (pp. 261–270). Springer.
  • Toshev, A., & Szegedy, C. (2014). Deeppose: Human pose estimation via deep neural networks. The IEEE Conference on Computer Vision and Pattern Recognition (CVPR). https://doi.org/10.1109/CVPR.2014.214
  • Turner, J., Bulling, A., & Gellersen, H. (2012). Extending the visual field of a head-mounted eye tracker for pervasive eye-based interaction. Proceedings of the Symposium on Eye Tracking Research and Applications - ETRA ‘12, 269–272. https://doi.org/10.1145/2168556.2168613
  • Valenti, R., & Gevers, T. (2008). Accurate eye center location and tracking using isophote curvature. 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR. https://doi.org/10.1109/CVPR.2008.4587529
  • Wang, C. C., Hung, J. C., Chen, S. N., & Chang, H. P. (2018a). Tracking students’ visual attention on manga-based interactive e-book while reading: An eye-movement approach. Multimedia Tools and Applications, 78(4), 4813–4834. https://doi.org/10.1007/s11042-018-5754-6
  • Wang, K., & Ji, Q. (2018). 3D gaze estimation without explicit personal calibration. Pattern Recognition, 79(1), 216–227. https://doi.org/10.1016/j.patcog.2018.01.031
  • Wang, Y., Zhao, T., Ding, X., Peng, J., Bian, J., & Fu, X. (2018). Learning a gaze estimator with neighbor selection from large-scale synthetic eye images. Knowledge-Based Systems, 139(1), 41–49. https://doi.org/10.1016/j.knosys.2017.10.010
  • Wilson, P. I., & Fernandez, J. (2006). Facial feature detection using Haar classifiers. Journal of Computing Sciences in Colleges, 21(4), 127–133.
  • Wood, E., Baltrušaitis, T., Morency, L.-P., Robinson, P., & Bulling, A. (2016a). Learning an appearance-based gaze estimator from one million synthesised images. Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications - ETRA ‘16. https://doi.org/10.1145/2857491.2857492
  • Wood, E., Baltrušaitis, T., Morency, L. P., Robinson, P., & Bulling, A. (2016b). A 3D morphable eye region model for gaze estimation. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 297–313. https://doi.org/10.1007/978-3-319-46448-0_18
  • Wu, Y. L., Yeh, C. T., Hung, W. C., & Tang, C. Y. (2014). Gaze direction estimation using support vector machine with active appearance model. Multimedia Tools and Applications, 70(3), 2037–2062. https://doi.org/10.1007/s11042-012-1220-z
  • Xiong, J., & Zuo, M. (2020). What does existing NeuroIS research focus on? Information Systems, 89, 101462. https://doi.org/10.1016/j.is.2019.101462
  • Yin, Y., Alqahtani, Y., Feng, J., & Chakraborty, J. (2021). Classification of eye tracking data in visual information processing tasks using convolutional neural networks and feature engineering. SN Computer Science, 2(2), 1–26. https://doi.org/10.1007/s42979-020-00444-0
  • Zhang, C., Yao, R., & Cai, J. (2018). Efficient eye typing with 9-direction gaze estimation. Multimedia Tools and Applications, 77(15), 19679–19696. https://doi.org/10.1007/s11042-017-5426-y
  • Zhang, X., Sugano, Y., Fritz, M., & Bulling, A. (2015). Appearance-based gaze estimation in the wild. Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 4511–4520. https://doi.org/10.1109/CVPR.2015.7299081
  • Zhang, X., Sugano, Y., Fritz, M., & Bulling, A. (2017). It’s written all over your face: Full-face appearance-based gaze estimation. IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, 51–60. https://doi.org/10.1109/CVPRW.2017.284
  • Zhou, X., Cai, H., Shao, Z., Yu, H., & Liu, H. (2016). 3D eye model-based gaze estimation from a depth sensor. 2016 IEEE International Conference on Robotics and Biomimetics (ROBIO), December 3-7, 369–374.
