Abstract
This chapter provides an overview of eXtended Reality (XR), a term that encompasses technologies such as Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR). XR allows the real world to be enriched with virtual data, objects, and information, offering varying levels of immersion and interaction. The chapter explores the components of XR and discusses classification frameworks, including Milgram’s Reality-Virtuality Continuum and the 3I model. It then traces the development of VR technologies, highlighting stereoscopic vision, VR headsets, and immersive walls, and examines AR, its accessibility through everyday devices, and its applications. The chapter emphasizes the importance of considering all sensory modalities when creating immersive XR experiences and discusses the role of multimodal interaction in enhancing user experiences. It highlights the significance of a multisensory approach in achieving a sense of presence and reviews examples of early immersive and multisensory technology. Finally, the chapter discusses the role of touch in human interaction and the incorporation of tactile sensations in VR applications, as well as olfaction, which plays a significant role in human perception and memory: certain smells can evoke strong emotional responses and trigger vivid recollections of past experiences. Incorporating odors into XR applications can therefore enhance the overall sense of presence and realism by providing users with scent cues that align with the virtual environment. Olfactory displays are devices that release specific odors or scents to accompany virtual content.
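The classification idea behind Milgram’s Reality-Virtuality Continuum can be illustrated with a small sketch: each system is assigned a position between the purely real environment (0.0) and the fully virtual one (1.0), and a coarse AR/MR/VR label follows from that position. Note that the numeric positions, thresholds, and device names below are hypothetical, chosen only to show the ordering of example systems along the continuum; they are not values from the chapter.

```python
from dataclasses import dataclass

@dataclass
class XRSystem:
    name: str
    rv_position: float  # 0.0 = purely real environment, 1.0 = fully virtual (hypothetical scale)

def classify(system: XRSystem) -> str:
    """Coarse label along the Reality-Virtuality Continuum (thresholds are illustrative)."""
    if system.rv_position < 0.4:
        return "Augmented Reality"
    if system.rv_position < 0.7:
        return "Mixed Reality"
    return "Virtual Reality"

# Hypothetical example systems, ordered from mostly real to fully virtual
SYSTEMS = [
    XRSystem("Smartphone AR overlay", 0.2),
    XRSystem("Optical see-through MR headset", 0.5),
    XRSystem("CAVE immersive wall", 0.8),
    XRSystem("VR head-mounted display", 0.95),
]

for s in sorted(SYSTEMS, key=lambda s: s.rv_position):
    print(f"{s.name}: {classify(s)}")
```

In practice the continuum is continuous rather than banded, and MR is often taken to span everything between the two extremes; the hard thresholds here only make the ordering of the examples concrete.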
References
Milgram P, Takemura H, Utsumi A, Kishino F (1995) Augmented reality: a class of displays on the reality-virtuality continuum. In: Proceedings of SPIE 2351, telemanipulator and telepresence technologies
Burdea G, Coiffet P (2003) Virtual reality technology, 2nd edn. Wiley-Interscience
Morton Heilig. http://www.mortonheilig.com. Last accessed 15 May 2023
Sutherland IE (1968) A head-mounted three dimensional display. In: AFIPS ‘68 (Fall, part I)
Oculus. https://www.oculus.com. Last accessed 15 May 2023
HTC Vive. https://www.vive.com. Last accessed 15 May 2023
Banks MS, Read JC, Allison RS, Watt SJ (2012) Stereoscopy and the human visual system. SMPTE Motion Imaging J 121(4):24–43
Cruz-Neira C, Sandin DJ, DeFanti TA, Kenyon RV, Hart JC (1992) The CAVE: audio visual experience automatic virtual environment. Commun ACM 35(6):64–72
Epson Moverio. https://moverio.epson.com. Last accessed 15 May 2023
Vuzix. https://www.vuzix.com. Last accessed 15 May 2023
Snapchat. https://www.snapchat.com. Last accessed 15 May 2023
Carulli M, Bordegoni M (2019) Multisensory augmented reality experiences for cultural heritage exhibitions. In: Rizzi C, Andrisano A, Leali F, Gherardini F, Pini F, Vergnano A (eds) Design tools and methods in industrial engineering. Lecture notes in mechanical engineering. Springer
Masoni R, Ferrise F, Bordegoni M, Gattullo M, Uva AE, Fiorentino M (2017) Supporting remote maintenance in industry 4.0 through augmented reality. Procedia Manuf 11:1296–1302
Microsoft Hololens. https://www.microsoft.com/en-us/hololens. Last accessed 15 May 2023
Witmer BG, Singer MJ (1998) Measuring presence in virtual environments: a presence questionnaire. Presence Teleoperators Virtual Environ 7(3):225–240
Steuer J (1992) Defining virtual reality: dimensions determining telepresence. J Commun 42(4):73–93
Burnett S (2011) Perceptual worlds and sensory ecology. Nat Educ Knowl 3(10):75
Bourguet ML (2003) Designing and prototyping multimodal commands. In: Proceedings of human-computer interaction (INTERACT’03), pp 717–720
Marr D (1982) Vision: a computational investigation into the human representation and processing of visual information. Henry Holt and Co Inc., New York, NY
Shams L, Seitz AR (2008) Benefits of multisensory learning. Trends Cogn Sci 12(11):411–417
Spence C, Driver J (2004) Crossmodal space and crossmodal attention. Oxford University Press
Bordegoni M, Ferrise F (2013) Designing interaction with consumer products in a multisensory virtual reality environment. Virtual Phys Prototyp 8(1):51–64
Heilig ML (1962) Sensorama simulator. US Patent 3,050,870
Marto A, Melo M, Gonçalves A, Bessa M (2020) Multisensory augmented reality in cultural heritage: impact of different stimuli on presence, enjoyment, knowledge and value of the experience. IEEE Access 8:193744–193756
Field TM (1995) Touch in early development. Lawrence Erlbaum Associates, Inc.
Gallace A, Spence C (2013) In touch with the future. Oxford University Press
Rolls ET, O’Doherty J, Kringelbach ML, Francis S, Bowtell R, McGlone F (2003) Representations of pleasant and painful touch in the human orbitofrontal and cingulate cortices. Cereb Cortex 13(3):308–317
Etzi R, Gallace A (2016) The arousing power of everyday materials: an analysis of the physiological and behavioral responses to visually and tactually presented textures. Exp Brain Res 234(6):1659–1666
Robles-De-La-Torre G (2006) The importance of the sense of touch in virtual and real environments. IEEE Multimed 13(3):24–30. Special issue on haptic user interfaces for multimedia systems
Argonne National Lab. https://www.anl.gov. Last accessed 15 May 2023
Phantom device. https://www.3dsystems.com/haptics-devices/3d-systems-phantom-premium. Last accessed 15 May 2023
CyberGrasp. http://www.cyberglovesystems.com/cybergrasp. Last accessed 15 May 2023
Pacchierotti C, Sinclair S, Solazzi M, Frisoli A, Hayward V et al (2017) Wearable haptic systems for the fingertip and the hand: taxonomy, review, and perspectives. IEEE Trans Haptics (ToH) 10(4):580–600
Ultraleap. https://www.ultraleap.com/haptics/. Last accessed 15 May 2023
Hi5 VR Glove. https://hi5vrglove.com. Last accessed 15 May 2023
HaptX. https://haptx.com. Last accessed 15 May 2023
Teslasuit. https://teslasuit.io. Last accessed 15 May 2023
Ultrahaptics. https://www.ultraleap.com/haptics/. Last accessed 15 May 2023
Ferrise F, Bordegoni M, Lizaranzu J (2010) Product design review application based on a vision-sound-haptic interface. In: Haptic and audio interaction design. Lecture notes in computer science (LNCS), vol 6306. Springer, pp 169–178
Ferrise F, Bordegoni M, Lizaranzu J (2011) Use of interactive virtual prototypes to define product design specifications: a pilot study on consumer product. In: Proceedings of IEEE-ISVRI, Singapore
Nakamoto T (2013) Human olfactory displays and interfaces: odor sensing and presentation. Information Science Reference
Porcherot C, Delplanque S, Raviot-Derrien S, Le Calvé B, Chrea C, Gaudreau N, Cayeux I (2010) How do you feel when you smell this? Optimization of a verbal measurement of odor-elicited emotions. Food Qual Prefer 21:938–947
Rétiveau AN, Chambers E IV, Milliken GA (2004) Common and specific effects of fine fragrances on the mood of women. J Sens Stud 19:373–394
Corbin A (1986) The foul and the fragrant: Odour and the French social imagination. Harvard University Press
Spangenberg ER, Crowley AE (1996) Improving the store environment: do olfactory cues affect evaluations and behaviors? J Mark 60(2):67–80
Bosmans A (2006) Scents and sensibility: when do (in)congruent ambient scents influence product evaluations? J Mark 70(3):32–43
Gatti E, Bordegoni M, Spence C (2014) Investigating the influence of colour, weight, and fragrance intensity on the perception of liquid bath soap: an experimental study. Food Qual Prefer 31:56–64
Demattè ML, Osterbauer R, Spence C (2007) Olfactory cues modulate facial attractiveness. Chem Senses 32(6):603–610
Bradford KD, Desrochers DM (2009) The use of scents to influence consumers: the sense of using scents to make cents. J Bus Ethics 90:141–153
Moessnang C, Finkelmeyer A, Vossen A, Schneider F, Habel U (2011) Assessing implicit odor localization in humans using a cross-modal spatial cueing paradigm. PLoS One 6(12)
Gottfried JA, Dolan RJ (2003) The nose smells what the eye sees: crossmodal visual facilitation of human olfactory perception. Neuron 39(2):375–386
Zhou W, Zhang X, Chen J, Wang L, Chen D (2012) Nostril-specific olfactory modulation of visual perception in binocular rivalry. J Neurosci 32(48):17225–17229
Blackwell L (1995) Visual cues and their effects on odour assessment. Nutr Food Sci 95(5):24–28
Yanagida Y, Tomono A (2012) Basics for olfactory display. In: Nakamoto T (ed) Human olfactory displays and interfaces: odor sensing and presentation. IGI Global
Vaqso. https://vaqso.com. Last accessed 15 May 2023
Feelreal. https://feelreal.com/. Last accessed 15 May 2023
Olorama. https://www.olorama.com/en/. Last accessed 15 May 2023
Carulli M, Bordegoni M, Cugini U (2015) Integrating scents simulation in virtual reality multisensory environment for industrial products evaluation. Comput Aided Des Appl
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
Bordegoni, M., Carulli, M., Spadoni, E. (2023). Multisensory Interaction in eXtended Reality. In: Prototyping User eXperience in eXtended Reality. SpringerBriefs in Applied Sciences and Technology. Springer, Cham. https://doi.org/10.1007/978-3-031-39683-0_4
Print ISBN: 978-3-031-39682-3
Online ISBN: 978-3-031-39683-0