Abstract
In this study, we aim to realize smoother communication between avatars in virtual space and discuss a method of eye-gaze interaction for avatar communication. This requires a gaze-movement detection algorithm capable of measuring characteristic eye movements, blinks, and pupil movements. We developed methods for counting these characteristic movements using an HMD's built-in eye-tracking system. Most input modalities in current virtual and augmented reality are hand gestures, head tracking, and voice input, even though the display is head-mounted. To use eye expressions as a hands-free input modality, we therefore consider an eye-gaze input interface that does not depend on the measurement accuracy of the tracking device. Previous eye-gaze interfaces suffer from the "Midas touch" problem: a trade-off between input speed and input errors. Using the methods developed so far, which employ characteristic eye movements and voluntary blinks as input channels, we aim to realize an input method that does not hinder the acquisition of other meta-information, such as eye gestures. Moreover, based on involuntary characteristic eye movements expressed unconsciously by the user, such as gaze shifts, we discuss a system that enables "expression tactics" in virtual space by providing natural feedback of the avatar's emotional expressions and movement patterns. As a first step, we report the results of eye-movement measurements taken in face-to-face experiments conducted to extract features of gaze and blinking.
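The abstract mentions counting voluntary blinks as an input channel distinct from spontaneous blinks. One common way to make that distinction is to threshold the eye-closure duration, since deliberate blinks tend to last longer than spontaneous ones. The sketch below is a hypothetical illustration of that idea, not the paper's actual algorithm; the function name, the openness signal, and both threshold values (`closed_below`, `min_voluntary_s`) are assumptions chosen for the example.

```python
# Hypothetical sketch: counting voluntary blinks from an eye tracker's
# per-sample eye-openness signal by thresholding closure duration.
# Threshold values are illustrative assumptions, not measured parameters.

def count_voluntary_blinks(openness, sample_rate_hz,
                           closed_below=0.2, min_voluntary_s=0.3):
    """Count eye closures lasting at least min_voluntary_s seconds.

    openness: sequence of eye-openness values in [0, 1]
              (1.0 = fully open), one per sample.
    """
    min_samples = int(min_voluntary_s * sample_rate_hz)
    count = 0
    run = 0  # length of the current closed-eye run, in samples
    for o in openness:
        if o < closed_below:
            run += 1
        else:
            if run >= min_samples:
                count += 1  # closure was long enough to be voluntary
            run = 0
    if run >= min_samples:  # handle a closure at the end of the trace
        count += 1
    return count

# 120 Hz trace: a 0.1 s (spontaneous) and a 0.4 s (voluntary) closure
trace = [1.0] * 60 + [0.0] * 12 + [1.0] * 60 + [0.0] * 48 + [1.0] * 60
print(count_voluntary_blinks(trace, 120))  # prints 1
```

In practice the same run-length logic extends to other "characteristic movements" named in the abstract (e.g. dwell on a target region), with per-gesture thresholds tuned per device and user.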
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Matsuno, S. (2023). Detection of Voluntary Eye Movement for Analysis About Eye Gaze Behaviour in Virtual Communication. In: Stephanidis, C., Antona, M., Ntoa, S., Salvendy, G. (eds) HCI International 2023 Posters. HCII 2023. Communications in Computer and Information Science, vol 1832. Springer, Cham. https://doi.org/10.1007/978-3-031-35989-7_35
Print ISBN: 978-3-031-35988-0
Online ISBN: 978-3-031-35989-7