The Effect of Eye Shape and the Use of Corrective Glasses on the Spatial Accuracy of Eye-Gaze-Based Robot Control with a Static Head Pose
DOI: https://doi.org/10.18196/jrc.v6i4.26229

Keywords: Eye Shape, Static Head Pose, Gaze Estimation Algorithms, Robot Control, Human-Robot Interaction, Inclusive Design

Abstract
The integration of eye-gaze technology into robotic control systems has shown considerable promise in enhancing human–robot interaction, particularly for individuals with physical disabilities. This study investigates the influence of eye morphology and the use of corrective eyewear on the spatial accuracy of gaze-based robot control under static head pose conditions. Experiments were conducted using advanced eye-tracking systems and five machine learning algorithms (decision tree, support vector machine, discriminant analysis, naïve Bayes, and k-nearest neighbor) on a participant pool with varied eye shapes and eyewear usage. The experimental design accounted for potential sources of bias, including lighting variability, participant fatigue, and calibration procedures. Statistical analyses revealed no significant differences in gaze estimation accuracy across eye shapes or eyewear status. However, a consistent pattern emerged: participants with non-monolid eye shapes achieved, on average, approximately 1% higher accuracy than those with monolid eye shapes, a difference that, while not statistically significant, warrants further exploration. The findings suggest that gaze-based robotic control systems can operate reliably across diverse user groups and hold strong potential for assistive technologies targeting individuals with limited mobility, including those with severe motor impairments such as head paralysis. To further enhance the inclusiveness and robustness of such systems, future research should examine additional anatomical variations and environmental conditions that may influence gaze estimation accuracy.
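To make the analysis pipeline concrete, the sketch below (written for this page, not taken from the paper) shows one way the five named classifier families could be compared with scikit-learn, followed by a two-sample t-test of the kind that could probe the monolid vs. non-monolid accuracy gap. The feature, label, and group arrays are synthetic placeholders; the paper's actual gaze features, participant data, and statistical procedure are not reproduced here.

# A minimal sketch, assuming scikit-learn classifiers and synthetic data.
import numpy as np
from scipy import stats
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Placeholder data: 200 samples, 4 hypothetical gaze features (e.g.,
# pupil-center and corneal-glint coordinates), 5 target regions.
# Real features would come from the eye tracker's calibration output.
X = rng.normal(size=(200, 4))
y = rng.integers(0, 5, size=200)

# The five algorithm families named in the abstract.
classifiers = {
    "decision tree": DecisionTreeClassifier(),
    "support vector machine": SVC(),
    "discriminant analysis": LinearDiscriminantAnalysis(),
    "naive Bayes": GaussianNB(),
    "k-nearest neighbor": KNeighborsClassifier(n_neighbors=5),
}

# Compare mean cross-validated gaze-region classification accuracy.
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")

# Group comparison in the spirit of the reported analysis: an independent
# two-sample t-test on per-participant accuracies (dummy values below,
# reflecting the ~1% average gap described in the abstract).
acc_monolid = rng.normal(0.90, 0.02, size=10)
acc_non_monolid = rng.normal(0.91, 0.02, size=10)
t, p = stats.ttest_ind(acc_non_monolid, acc_monolid)
print(f"t = {t:.2f}, p = {p:.3f}")  # p > 0.05 matches "no significant difference"

With noisy per-participant accuracies and a small sample, a ~1% mean gap will typically fail to reach significance, which is consistent with the abstract's framing of the gap as a pattern worth further study rather than a confirmed effect.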
License
Copyright (c) 2025 Engelbert Harsandi Erik Suryadarma, Pringgo Widyo Laksono, Ilham Priadythama, Lobes Herdiman, Muhammad Syaiful Amri Bin Suhaimi

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Authors who publish with this journal agree to the following terms:
- Authors retain copyright and grant the journal right of first publication, with the work simultaneously licensed under a Creative Commons Attribution-ShareAlike License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this journal.
- Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this journal.
- Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See The Effect of Open Access).
This journal is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License, based on the work at https://journal.umy.ac.id/index.php/jrc. You are free to:
- Share – copy and redistribute the material in any medium or format.
- Adapt – remix, transform, and build upon the material for any purpose, even commercially.
The licensor cannot revoke these freedoms as long as you follow the license terms, which include the following:
- Attribution. You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.
- ShareAlike. If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original.
- No additional restrictions. You may not apply legal terms or technological measures that legally restrict others from doing anything the license permits.
JRC is licensed under a Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) License.