Exploring Visual Languages for Prototyping Interactive Behaviors for Tangible Virtual Reality

  • Conference paper
  • First Online:
End-User Development (IS-EUD 2023)

Abstract

We explore the potential and limitations of visual programming environments for prototyping interactive behaviors in Tangible Virtual Reality, a technically complex task that requires integrating real-time tracking hardware with software that programs what happens in the virtual world given the position and orientation of physical objects. We created a plugin and an ad-hoc library to ease the integration of tracking hardware into the Unreal Blueprints visual environment, and we facilitated a one-day contextual inquiry workshop in which six designers and researchers (with textual programming expertise) programmed interactive behaviors using our library. Observations and contextual interviews with participants across two design activities uncovered areas of development for future visual end-user tools: provide different layers of abstraction, embrace liveness, foster in situ immersive programming, and enable the use of interactive machine learning to program behaviors via users’ physical demonstrations.


Notes

  1. https://www.unrealengine.com/es-ES/.

  2. https://assetstore.unity.com/packages/tools/visual-scripting/bolt-163802.

  3. https://antilatency.com/.

  4. A “pawn” typically refers to a basic, low-level game object (entity) that can be controlled or manipulated by a player or by AI.

  5. The TangibleVR plugin and the XRoom library are available at https://drive.google.com/drive/folders/19md1j7xzApyo1CAOCZ8-J9-brO7VhDWS.

  6. https://developers.google.com/blockly?hl=es-419.


Acknowledgments

This project has received funding from the Spanish State Research Agency (AEI) under grant Sense2MakeSense (PID2019-109388GB-I00) and from the Madrid Government (Comunidad de Madrid, Spain) under the Multiannual Agreement with UC3M in the line of Excellence of University Professors (EPUC3M17), in the context of the V PRICIT (Regional Programme of Research and Technological Innovation).

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Andrea Bellucci.

Editor information

Editors and Affiliations

Rights and permissions

Reprints and permissions

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Bellucci, A., Díaz, P., Aedo, I. (2023). Exploring Visual Languages for Prototyping Interactive Behaviors for Tangible Virtual Reality. In: Spano, L.D., Schmidt, A., Santoro, C., Stumpf, S. (eds) End-User Development. IS-EUD 2023. Lecture Notes in Computer Science, vol 13917. Springer, Cham. https://doi.org/10.1007/978-3-031-34433-6_13


  • DOI: https://doi.org/10.1007/978-3-031-34433-6_13

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-34432-9

  • Online ISBN: 978-3-031-34433-6

  • eBook Packages: Computer Science, Computer Science (R0)
