Controlling Robots Using Gaze Estimation: A Systematic Bibliometric and Research Trend Analysis

Engelbert Harsandi Erik Suryadarma, Pringgo Widyo Laksono, Ilham Priadythama, Lobes Herdiman

Abstract


The rapid progression of technology and robotics has brought about a transformative revolution in various fields. From industrial automation to healthcare and beyond, robots have become integral parts of our society, for example in maneuvering laparoscopic cameras. Eye-gaze-based control is a cutting-edge innovation in robotics that enhances human–robot interaction and control. However, gaze-based control of robots remains an underexplored research area. This paper presents a systematic bibliometric review of controlling robots using gaze estimation. The aim is to provide an overview map of research on eye-gaze robot control by clustering application areas according to the UN International Standard Industrial Classification (ISIC) and by data acquisition technology. Over the past 10 years, the number of publications in this field has been relatively stable, averaging 21.5 papers per year with minimal fluctuation in annual article counts (σ = 4.9). This contrasts with robotics research as a whole, which has grown by an average of 1376 papers per year. Over the same period, research on eye-gaze robot control has produced only 17 articles in human health and social work; 12 in transportation and storage; 8 in professional, scientific, and technical activities; 5 in information and communication; and 2 in education and the arts. Data acquisition in this research relies primarily on commercial eye trackers. Thus, there is significant potential for future research applying gaze estimation in the fields mentioned above.


Keywords


Gaze Estimation; Robot Control; Bibliometric Analysis; Human-Machine Collaboration.



DOI: https://doi.org/10.18196/jrc.v5i3.21686



Copyright (c) 2024 Engelbert Harsandi Erik Suryadarma, Pringgo Widyo Laksono, Ilham Priadythama

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.



Journal of Robotics and Control (JRC)

P-ISSN: 2715-5056 || E-ISSN: 2715-5072
Organized by Peneliti Teknologi Teknik Indonesia
Published by Universitas Muhammadiyah Yogyakarta in collaboration with Peneliti Teknologi Teknik Indonesia, Indonesia and the Department of Electrical Engineering
Website: http://journal.umy.ac.id/index.php/jrc
Email: jrcofumy@gmail.com

