The Efficiency of Augmented Pointing with and Without Speech in a Collaborative Virtual Environment

  • Conference paper
  • In: Digital Human Modeling and Applications in Health, Safety, Ergonomics and Risk Management (HCII 2023)

Abstract

Pointing is a ubiquitous part of human communication. However, pointing gestures to distal referents are often misunderstood systematically, which may limit the usefulness of pointing. We examined pointing-based communication in a collaborative virtual environment (CVE) to address three questions. First, we wanted to evaluate the potential of apparently natural but technically augmented pointing in CVEs, such as presenting a warped pointer for increased legibility or allowing the observer to assume the pointer’s perspective. Second, we wanted to test whether technical improvements in pointing accuracy also facilitate communication when pointing is accompanied by speech. Third, we wanted to check whether pointing accuracy is correlated with the efficiency of communication involving pointing and speech. An experiment revealed that pointing-based communication has considerable potential to be enhanced in CVEs, although the specific augmentation procedure we employed did not improve pointing-based communication. Importantly, improvements in pointing accuracy also facilitated communication when speech was allowed. Speech reduced misunderstandings but could not rule them out. Finally, even a small gain in pointing accuracy allowed participants to agree on a referent faster. In summary, the experiment suggests that augmented pointing may considerably improve interactions in CVEs. Moreover, speech cannot fully compensate for misunderstandings of pointing gestures, and relatively small differences in pointing accuracy affect the efficiency of communication even when speech is allowed.


Notes

  1. In the current paper, we examine misunderstandings or errors between pointers and observers without considering the individual contributions of both interlocutors. If we use terms such as “biased interpretations”, we refer to the mismatch between the pointer’s referent and the observer’s guess thereof, without attributing the misunderstanding to either of the interlocutors.

  2. Our algorithm differed from previously applied warping methods [11, 16]. Unlike [16], it takes horizontal errors into account but does not consider the non-linearity of pointing extrapolation. Unlike [11], we used a parameter-free geometric model for simplicity (an illustrative sketch of such a warp follows below).
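The excerpt does not include the warping algorithm itself, so the following is only a minimal, hypothetical sketch of what a parameter-free geometric warp of this kind could look like. It assumes, purely for illustration, that the pointer’s intended referent is approximated by extrapolating the eye–fingertip ray onto a planar target surface, and that the displayed avatar arm is then re-aimed so that the shoulder–fingertip ray, which observers tend to extrapolate, passes through that point. The function and variable names are invented for this sketch and are not taken from the paper.

```python
import numpy as np

def warp_pointer_arm(eye, fingertip, shoulder, surface_point, surface_normal):
    """Illustrative parameter-free geometric warp (not the authors' implementation).

    Assumption: the pointer's intended referent is approximated by intersecting
    the eye-fingertip ray with a planar target surface; the displayed arm is
    then rotated so the shoulder-fingertip ray passes through that point.
    """
    # 1. Extrapolate the eye-fingertip ray onto the target plane.
    gaze_dir = fingertip - eye
    denom = np.dot(surface_normal, gaze_dir)
    if abs(denom) < 1e-9:
        return fingertip  # ray is parallel to the surface; leave the arm unwarped
    t = np.dot(surface_normal, surface_point - eye) / denom
    referent = eye + t * gaze_dir

    # 2. Re-aim the displayed arm: keep the arm length, but point the
    #    shoulder-fingertip segment directly at the estimated referent.
    arm_length = np.linalg.norm(fingertip - shoulder)
    new_dir = referent - shoulder
    new_dir = new_dir / np.linalg.norm(new_dir)
    return shoulder + arm_length * new_dir

# Example call with made-up coordinates (metres, y up, target wall at z = 3).
eye       = np.array([0.0, 1.6, 0.0])
fingertip = np.array([0.3, 1.4, 0.6])
shoulder  = np.array([0.2, 1.4, 0.0])
warped_fingertip = warp_pointer_arm(
    eye, fingertip, shoulder,
    surface_point=np.array([0.0, 0.0, 3.0]),
    surface_normal=np.array([0.0, 0.0, -1.0]),
)
```

In this form the warp corrects both vertical and horizontal discrepancies between the arm direction and the eye–fingertip line, which is consistent with the note’s statement that horizontal errors are taken into account, while deliberately ignoring any non-linearity of pointing extrapolation.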

References

  1. Bakdash, J.Z., Marusich, L.R.: Repeated measures correlation. Front. Psychol. 8(456), 1–13 (2017). https://doi.org/10.3389/fpsyg.2017.00456

  2. Bangerter, A.: Using pointing and describing to achieve joint focus of attention in dialogue. Psychol. Sci. 15(6), 415–419 (2004). https://doi.org/10.1111/j.0956-7976.2004.0069

  3. Bangerter, A., Oppenheimer, D.M.: Accuracy in detecting referents of pointing gestures unaccompanied by language. Gesture 6(1), 85–102 (2006). https://doi.org/10.1075/gest.6.1.05ban

  4. Butterworth, G., Itakura, S.: How the eyes, head and hand serve definite reference. Br. J. Dev. Psychol. 18(1), 25–50 (2000). https://doi.org/10.1348/026151000165553

  5. Herbort, O., Krause, L.-M., Kunde, W.: Perspective determines the production and interpretation of pointing gestures. Psychon. Bull. Rev. 28(2), 641–648 (2020). https://doi.org/10.3758/s13423-020-01823-7

  6. Herbort, O., Kunde, W.: Spatial (mis-)interpretation of pointing gestures to distal referents. J. Exp. Psychol.: Hum. Percept. Perform. 42(1), 78–89 (2016). https://doi.org/10.1037/xhp0000126

  7. Herbort, O., Kunde, W.: How to point and to interpret pointing gestures? Instructions can reduce pointer-observer misunderstandings. Psychol. Res. 82(2), 395–406 (2018). https://doi.org/10.1007/s00426-016-0824-8

  8. Krause, L.-M., Herbort, O.: The observer’s perspective determines which cues are used when interpreting pointing gestures. J. Exp. Psychol.: Hum. Percept. Perform. 47(9), 1209–1225 (2021). https://doi.org/10.1037/xhp0000937

  9. Louwerse, M.M., Bangerter, A.: Focusing attention with deictic gestures and linguistic expressions. In: Proceedings of the 27th Annual Meeting of the Cognitive Science Society (2005)

  10. Lücking, A., Pfeiffer, T., Rieser, H.: Pointing and reference reconsidered. J. Pragmat. 77, 56–79 (2015). https://doi.org/10.1016/j.pragma.2014.12.013

  11. Mayer, S., et al.: Improving humans’ ability to interpret deictic gestures in virtual reality. In: Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, pp. 1–14. Association for Computing Machinery, New York (2020). https://doi.org/10.1145/3313831.3376340

  12. Pechmann, T., Deutsch, W.: The development of verbal and nonverbal devices for reference. J. Exp. Child Psychol. 34(2), 330–341 (1982). https://doi.org/10.1016/0022-0965(82)90050-9

  13. R Core Team: R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria (2022)

  14. Singmann, H., Bolker, B., Westfall, J., Aust, F., Ben-Shachar, M.S.: afex: Analysis of factorial experiments (2022)

  15. van der Sluis, I., Krahmer, E.: Generating multimodal references. Discourse Process. 44(3), 145–174 (2007). https://doi.org/10.1080/01638530701600755

  16. Sousa, M., dos Anjos, R.K., Mendes, D., Billinghurst, M., Jorge, J.: Warping DEIXIS: distorting gestures to enhance collaboration. In: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, pp. 1–12. CHI 2019, Association for Computing Machinery, New York (2019). https://doi.org/10.1145/3290605.3300838

  17. Wnuczko, M., Kennedy, J.M.: Pivots for pointing: visually-monitored pointing has higher arm elevations than pointing blindfolded. J. Exp. Psychol.: Hum. Percept. Perform. 37(5), 1485–1491 (2011). https://doi.org/10.1037/a0024232

  18. Wong, N., Gutwin, C.: Where are you pointing? The accuracy of deictic pointing in CVEs. In: Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI 2010), pp. 1029–1038 (2010). https://doi.org/10.1145/1753326.1753480

  19. Wong, N., Gutwin, C.: Support for deictic pointing in CVEs: still fragmented after all these years. In: Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing, pp. 1377–1387. ACM (2014). https://doi.org/10.1145/2531602.2531691


Acknowledgments

We thank Anne Hanfbauer and Stefanie Flepsen for their help with the data collection. This work was supported by the German Research Foundation DFG (Grants HE6710/5–1 and HE6710/6–1 to Oliver Herbort).

Author information

Correspondence to Oliver Herbort.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary file 1 (PDF 50 kb)

Rights and permissions

Reprints and permissions

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Herbort, O., Krause, L.-M. (2023). The Efficiency of Augmented Pointing with and Without Speech in a Collaborative Virtual Environment. In: Duffy, V.G. (eds) Digital Human Modeling and Applications in Health, Safety, Ergonomics and Risk Management. HCII 2023. Lecture Notes in Computer Science, vol. 14028. Springer, Cham. https://doi.org/10.1007/978-3-031-35741-1_37


  • DOI: https://doi.org/10.1007/978-3-031-35741-1_37

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-35740-4

  • Online ISBN: 978-3-031-35741-1

  • eBook Packages: Computer Science, Computer Science (R0)
