Detection of Voluntary Eye Movement for Analysis About Eye Gaze Behaviour in Virtual Communication

  • Conference paper
  • First Online:
HCI International 2023 Posters (HCII 2023)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1832)


Abstract

In this study, we aim to realize smoother communication between avatars in virtual space and discuss an eye-gaze interaction method for avatar communication. This requires a detection algorithm for specific gaze movements, capable of measuring characteristic eye movements, blinks, and pupil movements. We developed methods for counting these characteristic movements using the eye-tracking system built into an HMD. Although HMDs are head-mounted, most input devices in current virtual and augmented reality rely on hand gestures, head tracking, and voice input. To use eye expressions as a hands-free input modality, we therefore consider an eye-gaze input interface that does not depend on the measurement accuracy of the sensing device. Previous eye-gaze interfaces suffer from the so-called “Midas touch” problem, a trade-off between input speed and input errors. Using the methods developed so far, which employ characteristic eye movements and voluntary blinks as an input channel, we aim to realize an input method that does not hinder the acquisition of other meta-information, such as eye gestures. Moreover, based on involuntary characteristic eye movements expressed unconsciously by the user, such as gaze movements, we discuss a system that enables “expression tactics” in virtual space by providing natural feedback of the avatar's emotional expressions and movement patterns. As a first step, we report the results of face-to-face eye-movement measurements conducted through experiments in order to extract features of gaze and blinking.
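The abstract describes counting voluntary blinks as an input channel while distinguishing them from spontaneous ones. The paper does not give its detection algorithm, but one common approach is to segment intervals where an HMD's eye-openness signal drops below a threshold and then classify each closure by duration, since deliberate blinks are typically held longer than spontaneous ones. The sketch below illustrates that idea; the function names, the 0.2 openness threshold, and the 300 ms duration cutoff are illustrative assumptions, not values from the paper.

```python
def detect_blinks(openness, closed_threshold=0.2):
    """Return (start, end) sample-index pairs of intervals where the
    eye-openness signal stays below `closed_threshold` (eye closed).
    `openness` is a sequence of values in [0, 1], as typically reported
    by HMD eye-tracking SDKs. Thresholds here are assumptions."""
    blinks = []
    start = None
    for i, value in enumerate(openness):
        if value < closed_threshold:
            if start is None:
                start = i          # closure begins
        elif start is not None:
            blinks.append((start, i))  # closure ended at sample i
            start = None
    if start is not None:              # signal ends mid-closure
        blinks.append((start, len(openness)))
    return blinks


def classify_blinks(blinks, sample_rate_hz, voluntary_min_s=0.3):
    """Label each closure: spontaneous blinks are brief (roughly under
    300 ms), while deliberately held closures last longer. The cutoff
    is a hypothetical parameter for illustration."""
    labels = []
    for start, end in blinks:
        duration_s = (end - start) / sample_rate_hz
        labels.append("voluntary" if duration_s >= voluntary_min_s
                      else "spontaneous")
    return labels


# Example: a 90 Hz openness trace with one short and one long closure.
trace = [1.0] * 10 + [0.1] * 9 + [1.0] * 10 + [0.1] * 36 + [1.0] * 5
closures = detect_blinks(trace)
print(classify_blinks(closures, sample_rate_hz=90))
```

A duration threshold alone would confound long voluntary blinks with drowsy eye closure; the abstract's mention of combining blinks with characteristic gaze movements suggests a richer feature set in the actual system.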



Author information


Corresponding author

Correspondence to Shogo Matsuno.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Matsuno, S. (2023). Detection of Voluntary Eye Movement for Analysis About Eye Gaze Behaviour in Virtual Communication. In: Stephanidis, C., Antona, M., Ntoa, S., Salvendy, G. (eds) HCI International 2023 Posters. HCII 2023. Communications in Computer and Information Science, vol 1832. Springer, Cham. https://doi.org/10.1007/978-3-031-35989-7_35

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-35989-7_35

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-35988-0

  • Online ISBN: 978-3-031-35989-7

  • eBook Packages: Computer Science (R0)
