Abstract
Purpose
Eye-tracking technology can replace conventional input devices, such as the mouse and keyboard, for interacting with various types of web content. However, when used to control a web browser, it can cause serious errors in pointing at objects and executing them: because executable objects are small, users find them difficult to point at accurately and often trigger them unintentionally.
Methods
We propose an interface that resolves these two types of pointer-execution errors, enabling people with upper-limb disabilities to control a graphical user interface. The proposed interface dynamically magnifies the execution objects being tracked and executes the object pointed at by eye gaze via voice commands. We implemented the interface as a Chrome extension, so it runs embedded in the browser.
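The core interaction loop described above can be sketched as follows. This is a minimal, self-contained illustration, not the authors' implementation: the real system reads gaze from the tobii_research API and voice commands from the Google Cloud Speech API inside a Chrome extension, whereas here the gaze coordinates, voice command, the `Target` class, and the threshold and ratio constants are all hypothetical stand-ins chosen for the sketch.

```python
from dataclasses import dataclass

MAGNIFY_THRESHOLD_PX = 30   # assumed: objects smaller than this are magnified
MAGNIFICATION_RATIO = 2.0   # assumed ratio; the paper determines this experimentally


@dataclass
class Target:
    """A clickable page object, centered at (x, y), in screen pixels."""
    x: float
    y: float
    width: float
    height: float
    scale: float = 1.0

    def contains(self, gx: float, gy: float) -> bool:
        # Hit-test against the (possibly magnified) bounding box.
        half_w = self.width * self.scale / 2
        half_h = self.height * self.scale / 2
        return abs(gx - self.x) <= half_w and abs(gy - self.y) <= half_h


def on_gaze(target: Target, gx: float, gy: float) -> None:
    """Dynamically magnify a small object while the gaze rests on it."""
    if target.contains(gx, gy) and max(target.width, target.height) < MAGNIFY_THRESHOLD_PX:
        target.scale = MAGNIFICATION_RATIO
    else:
        target.scale = 1.0


def on_voice(target: Target, gx: float, gy: float, command: str) -> str:
    """Execute the gazed-at object only on an explicit voice command,
    which prevents unintended executions from gaze alone."""
    if command == "click" and target.contains(gx, gy):
        return "executed"
    return "ignored"


# Usage: a small button is magnified while gazed at, then executed by voice.
btn = Target(x=100, y=100, width=20, height=12)
on_gaze(btn, 102, 101)                    # gaze lands on the small button
print(btn.scale)                          # 2.0 (magnified)
print(on_voice(btn, 108, 104, "click"))   # executed (inside magnified bounds)
print(on_voice(btn, 108, 104, "scroll"))  # ignored (wrong command)
```

Separating the magnification trigger (gaze dwell) from the execution trigger (voice) is what addresses the two error types: the enlarged hit area eases pointing, and the explicit command blocks accidental clicks.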
Results
We verified the effect of this interface through experiments measuring the reduction in pointer-execution errors and identifying an appropriate magnification ratio. The results show that the eye-voice interface lowered the pointer-execution error rate, demonstrating its effectiveness.
Conclusion
We confirmed that an eye-tracking interface combined with voice commands is effective in executing menus and execution objects in a web browser.
Data availability
Applicable.
Code availability
Applicable.
Acknowledgements
This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (NRF-2018R1A4A1025559).
Ethics declarations
Conflicts of interest
On behalf of all authors, the corresponding author states that there are no conflicts of interest.
About this article
Cite this article
Lim, SB., Park, J.H. Development of an eye-tracking and voice command interface to facilitate GUI operation for people with disabled upper limbs. Univ Access Inf Soc 23, 329–343 (2024). https://doi.org/10.1007/s10209-022-00939-y