Selection in Stride: Comparing Button- and Head-Based Augmented Reality Interaction During Locomotion

  • Conference paper
HCI International 2024 Posters (HCII 2024)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 2116)

Abstract

Military users of augmented reality (AR) head-mounted displays must interact with their heads-up displays (HUDs) effectively while on the move. Yet, there is a paucity of human-computer interaction (HCI) studies investigating AR multimodal interfaces (MMIs) and interaction methods during locomotion. We conducted a mixed-methods study comparing stationary and ambulatory button- and head-based AR interaction methods. In a within-participants design, Soldier participants completed a simple task sequence in an AR HUD, both while walking on an omnidirectional treadmill and while standing still, using a chest-mounted controller alone (C) and a head-gaze cursor with button input for selection (C + HG). Quantitative task performance analysis revealed faster time-on-task for the C + HG method when stationary. However, when walking, the C method generally surpassed the C + HG method. Careful analysis of selection and head-gaze hovering inputs reflected participants’ difficulty stabilizing their heads while walking, which led to inaccuracies in menu icon selection and necessitated additional selection input. Moreover, several participants reported difficulty stabilizing their head-gaze, as well as a greater preference for and better success with the C method when performing the task sequence while walking. Taken together, these findings suggest that while head-gaze is a promising AR interaction method in relatively stationary contexts, its reliance on good head stability for reliable interaction negatively impacts task performance and user experience during locomotion. This study brings attention to the challenges of MMIs in ambulatory AR usage contexts and the need for more research in this area.
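To make the C + HG interaction concrete, the following is a minimal sketch of head-gaze hover detection with button-confirm selection. It is illustrative only: the study’s interface ran on an AR HUD, and the icon layout, angular-radius threshold, and names used here (MenuIcon, hovered_icon, on_button_press) are assumptions for exposition, not the authors’ implementation.

```python
# Illustrative sketch only: thresholds, names, and geometry are assumptions,
# not the study's actual HUD implementation.
from __future__ import annotations

from dataclasses import dataclass
from typing import Optional

import numpy as np


@dataclass
class MenuIcon:
    name: str
    direction: np.ndarray            # unit vector from the head to the icon center
    angular_radius_deg: float = 2.5  # assumed selectable half-angle around the icon


def hovered_icon(head_forward: np.ndarray, icons: list[MenuIcon]) -> Optional[MenuIcon]:
    """Return the icon the head-gaze cursor currently rests on, if any.

    Head-gaze is approximated by the head's forward vector; an icon counts as
    hovered when the angle between that vector and the icon's direction falls
    within the icon's angular radius. The closest qualifying icon wins.
    """
    head_forward = head_forward / np.linalg.norm(head_forward)
    best: Optional[MenuIcon] = None
    best_angle = np.inf
    for icon in icons:
        cos_angle = float(np.clip(np.dot(head_forward, icon.direction), -1.0, 1.0))
        angle_deg = float(np.degrees(np.arccos(cos_angle)))
        if angle_deg <= icon.angular_radius_deg and angle_deg < best_angle:
            best, best_angle = icon, angle_deg
    return best


def on_button_press(head_forward: np.ndarray, icons: list[MenuIcon]) -> Optional[MenuIcon]:
    """C + HG selection: the button confirms whatever the head-gaze cursor hovers.

    If head sway has carried the cursor off the intended icon by the time the
    button is pressed, nothing is selected and another press is required.
    """
    return hovered_icon(head_forward, icons)


if __name__ == "__main__":
    icons = [
        MenuIcon("map", np.array([0.0, 0.0, 1.0])),
        MenuIcon("compass", np.array([np.sin(np.radians(10.0)), 0.0, np.cos(np.radians(10.0))])),
    ]
    # A slightly off-axis head pose still lands on "map"; larger sway selects nothing.
    hit = on_button_press(np.array([0.01, 0.02, 1.0]), icons)
    print(hit.name if hit else "no selection")
```

This sketch also illustrates the failure mode the abstract describes: while walking, head oscillation can move the gaze ray off a small icon between acquiring hover and pressing the button, so presses register as misses and additional selection inputs are needed.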

Disclosure of Interests

The authors have no competing interests to declare that are relevant to the content of this article.

Acknowledgements

We thank Daniel Grover for assistance in developing the walking speed data collection software. We also thank Jessica Armstrong for assistance with data collection.

Funding

This work was conducted by the DEVCOM SC Cognitive Science and Applications Branch and was supported by the Measuring and Advancing Soldier Tactical Readiness and Effectiveness (MASTR-E) Program and the Center for Applied Brain and Cognitive Sciences under a cooperative agreement (W911QY-19-2-0003) with Tufts University during the period of February 2022 to March 2024.

Author information

Corresponding author

Correspondence to Aaron L. Gardony.

Ethics declarations

The views expressed in this article are solely those of the authors and do not reflect the official policies or positions of the Department of the Army, the Department of Defense, or any other department or agency of the U.S. Government. The primary author prepared this work as part of their official duties as an employee of the United States Government. Pursuant to Section 105 of the Copyright Act of 1976, this work is not entitled to domestic copyright protection under U.S. law. The citation of trade names in this report does not constitute official product endorsement or approval. The companies providing software and technology (Infinadeck and Pison Technology Inc.) to support this effort did not contribute to the preparation of this report.

Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Gardony, A.L., Okano, K., Whitig, A.B., Smith, M. (2024). Selection in Stride: Comparing Button- and Head-Based Augmented Reality Interaction During Locomotion. In: Stephanidis, C., Antona, M., Ntoa, S., Salvendy, G. (eds) HCI International 2024 Posters. HCII 2024. Communications in Computer and Information Science, vol 2116. Springer, Cham. https://doi.org/10.1007/978-3-031-61950-2_3

  • DOI: https://doi.org/10.1007/978-3-031-61950-2_3

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-61949-6

  • Online ISBN: 978-3-031-61950-2

  • eBook Packages: Computer Science, Computer Science (R0)
