
Abstract

Designing automotive head-up display (HUD) interfaces requires careful consideration of visual guidelines to ensure safety. While specific safety guidelines exist, a general set of visual guidelines has not yet been established. Therefore, this research presents a comprehensive methodology for deriving overall visual guidelines for projecting warnings on HUD interfaces. To this end, 20 test subjects drove in various scenarios while visual stimuli, based on previously tested visual guidelines, were projected on a specific HUD system, allowing drivers’ behavior patterns and reaction trends to be identified. The results obtained with this methodology show that it is possible to integrate all previous qualitative and quantitative visual guidelines, yielding faster driver reactions and better recognition of warnings. This integration makes it possible to determine the most and least suitable ways of presenting information in a specific HUD system with respect to identification mistakes and reaction times. Moreover, these findings imply the feasibility of anticipating a driver’s comprehension of warnings on HUD interfaces.


Notes

  1. The LC (Landolt C) ring is an optotype proposed by Edmund Landolt for evaluating visual acuity.

  2. Sport utility vehicle: mass = 1000 kg, engine speed (min–max) = 3000–10,000 rpm, differential ratio = 3.67, front brake torque = 6000 Nm, rear brake torque = 5500 Nm, wheel mass (4 wheels) = 23 kg, wheel radius = 0.4 m, dynamic friction (wheel–ground) = 0.07, static friction (wheel–ground) = 0.9.

  3. Lane track (11.6 m wide) arranged as follows: straight line of 1119.63 m, curve of radius 162.23 m, straight line of 378.67 m, curve of radius 104.78 m, straight line of 1811.40 m, 3 consecutive curves of radius 108.90 m, 209.63 m, and 209.63 m respectively, straight line of 198.58 m, 4 consecutive curves all of radius 211.60 m, straight line of 176.09 m, 2 consecutive curves of radius 213.68 m and 207.66 m respectively (downhill), straight line of 176.09 m, 3 consecutive curves of radius 206.58 m, 206.58 m, and 190.24 m respectively (uphill). Visibility conditions were good and there was no traffic, but some obstacles were randomly placed on the track in order to avoid learning effects as much as possible.

  4. The reflectance of material I was analyzed over the visible spectrum, in order to consider the 5 different colors suggested in Sect. 3.3.

  5. VA is defined in Eq. (3) as proposed by Colenbrander [42]:

    $$ \text{VA} = 1/\alpha \left[ \text{arc min} \right] $$
    (3)

    where α is the angle that the gap size (b value, see Fig. 7) of the LC rings subtends at the test subjects’ eyes.

  6. E.g., the G1 group refers to all warnings projected at sizes of 10 mm and 15 mm with an exposure time of 2 s, in any of the positions, delay times, and colors analyzed here.
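As an illustrative sketch of the VA definition in Eq. (3) (note 5), decimal visual acuity can be computed from the Landolt C gap size b and the viewing distance. The function and variable names below are ours, not the authors’, and both lengths are assumed to be in meters:

```python
import math

def visual_acuity(gap_size_m: float, distance_m: float) -> float:
    """Decimal visual acuity VA = 1/alpha, where alpha is the angle
    (in arc minutes) that the Landolt C gap subtends at the eye."""
    # Angle subtended by the gap, in radians.
    alpha_rad = 2 * math.atan(gap_size_m / (2 * distance_m))
    # Convert to arc minutes (1 degree = 60 arc min).
    alpha_arcmin = math.degrees(alpha_rad) * 60
    return 1 / alpha_arcmin

# A gap of ~1.454 mm viewed at 5 m subtends ~1 arc min, giving VA ~ 1.0.
va = visual_acuity(0.001454, 5.0)
```

The small-angle approximation makes VA essentially linear in distance over gap size at typical HUD viewing distances.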

References

  1. Ando, S., Kida, N., Oda, S.: Practice effects on reaction time for peripheral and central visual fields. Percept. Mot. Skills 95(3), 747–752 (2002)


  2. Frison, A.K., Forster, Y., Wintersberger, P., Geisel, V., Riener, A.: Where we come from and where we are going: a systematic review of human factors research in driving automation. Appl. Sci. (Switzerland) 10(24), 1–36 (2020). https://doi.org/10.3390/app10248914


  3. Götze, M., Bißbort, F., Petermann-Stock, I., Bengler, K.: A careful driver is one who looks in both directions when he passes a red light. Increased demands in urban traffic (2014)

  4. Häuslschmid, R., Osterwald, S., Lang, M., Butz, A.: Augmenting the driver’s view with peripheral information on a windshield display. International Conference on Intelligent User Interfaces. Proceedings IUI, 311–321. https://doi.org/10.1145/2678025.2701393 (2015)

  5. Heymann, M., Degani, A.: Classification and organization of information. In: Design of Multimodal Mobile Interfaces, pp. 195–217. De Gruyter (2016)

  6. Feierle, A., Beller, D., Bengler, K.: Head-up displays in urban partially automated driving: effects of using augmented reality. IEEE Intelligent Transportation Systems Conference (ITSC), 1877–1882 (2019)

  7. Continental: User Experience Head-up Displays. http://continental-head-updisplay.com (2016). Accessed 20 March 2022

  8. Pečečnik, K., Tomažič, S., Sodnik, J.: Design of head-up display interfaces for automated vehicles. Int. J. Human Comput. Stud. (2023). https://doi.org/10.1016/j.ijhcs.2023.103060


  9. Langlois, S., Soualmi, B.: Augmented reality versus classical HUD to take over from automated driving: An aid to smooth reactions and to anticipate maneuvers. IEEE 19th International Conference on Intelligent Transportation Systems, 1571–1578 (2016)

  10. Tretten, P., Gärling, A., Nilsson, R., Larsson, T.C.: An on-road study on head-up display preferred location and acceptance levels. Proc. Human Factors Ergon. Soc. 55(1), 1914–1918 (2011)


  11. Yoo, H., Tsimhoni, O., Watanabe, H., Green, P., Shah, R.: Display of HUD Warnings to Drivers: Determining an Optimal Location, (Technical Report UMTRI-99-5, ITS RCE report #939423), Ann Arbor. University of Michigan Transportation Research Institute, MI (1999)


  12. Ma, X., Jia, M., Hong, Z., Kwok, A.P.K., Yan, M.: Does augmented-reality head-up display help? A preliminary study on driving performance through a VR-simulated eye movement analysis. IEEE Access 9, 129951–129964 (2021). https://doi.org/10.1109/ACCESS.2021.3112240


  13. Liu, Y.C., Wen, M.-H.: Comparison of head-up display (HUD) vs. head-down display (HDD): driving performance of commercial vehicle operators in Taiwan. Int. J. Human Comput. Stud. 61(5), 679–697 (2004). https://doi.org/10.1016/j.ijhcs.2004.06.002


  14. Currano, R., Park, S.Y., Moore, D.J., Lyons, K., Sirkin, D.: Little road driving hud: Heads-up display complexity influences drivers’ perceptions of automated vehicles. In: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1–15 (2021)

  15. Gabbard, J.L., Fitch, G.M., Kim, H.: Driver queries using wheel-constrained finger pointing and 3-D head-up display visual feedback. Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 52–62 (2014)

  16. Fujimura, K., Xu, L., Tran, C., Bhandari, R., Ng-Thow-Hing, V.: Driver queries using wheel-constrained finger pointing and 3-D head-up display visual feedback. Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 56–62 (2013)

  17. Horrey, W.J., Wickens, C.D., Alexander, A.L.: The effects of head-up display clutter and in-vehicle display separation on concurrent driving performance. Proc. Human Factors Ergon. Soc. Annu. Meet. 47(16), 1880–1884 (2003)


  18. Riegler, A., Riener, A., Holzmann, C.: Augmented reality for future mobility: insights from a literature review and HCI workshop. I-Com 20(3), 295–318 (2021). https://doi.org/10.1515/icom-2021-0029


  19. Jakus, G., Dicke, C., Sodnik, J.: A user study of auditory, head-up and multi-modal displays in vehicles. Appl. Ergon. 46, 184–192 (2015). https://doi.org/10.1016/j.apergo.2014.08.008


  20. Pauzie, A.: Head up display in automotive: a new reality for the driver. International Conference of Design, User Experience and Usability, 505–516 (2015)

  21. Lüke, S., Fochler, O., Schaller, T., Regensburger, T.: Stauassistenz und -automation. In: Winner, H., Hakuli, S., Lotz, F., Singer, C. (eds.) Handbuch Fahrerassistenzsysteme, 3rd edn. Springer Vieweg, Wiesbaden (2015)

  22. Miličić, N.: Sichere und ergonomische Nutzung von Head-up-Displays im Fahrzeug. Dissertation, TU München (2010)

  23. Raubitschek, C.: Prioritätenorientierte Implementierung einer Menüinteraktion im Head-Up Display für den Automobilbereich. Diplomarbeit, Lehrstuhl für Mensch-Maschine-Kommunikation, TUM, München (2008).

  24. Park, J., Park, W.: A review on the interface design of automotive head-up displays for communicating safety-related information. Proc Hum Factors Ergon Soc Annu Meet 63(1), 2016–2017 (2019). https://doi.org/10.1177/1071181319631099


  25. Merenda, C., Smith, M., Gabbard, J., Burnett, G., Large, D.: Effects of real-world backgrounds on user interface color naming and matching in automotive AR HUDs. IEEE Workshop on Perceptual and Cognitive Issues in AR, 57–68. https://doi.org/10.1109/PERCAR.2016.7562419 (2016)

  26. Smith, S., Fu, S.: The relationships between automobile head-up display presentation images and drivers’ kansei. Displays 32(2), 58–68 (2011). https://doi.org/10.1016/j.displa.2010.12.001


  27. Brown, A.S., Birman, V., Miciuda, E.: Optimization suggestions for instrument-cluster information using displays. J. Soc. Inform. Display 19(10), 665–670 (2011). https://doi.org/10.1889/JSID19.10.665


  28. Alves, P. R. J. A., Goncalves, J., Rossetti, R. J. F., Oliveira, E. C., Olaverri-Monreal, C.: Forward collision warning systems using heads-up displays: Testing usability of two new metaphors. IEEE Intelligent Vehicles Symposium, 1–6. https://doi.org/10.1109/IVS.2013.6629438 (2013)

  29. Beck, D., Jung, J., Park, J., Park, W.: A study on user experience of automotive HUD systems: contexts of information use and user-perceived design improvement points. Int. J. Hum.-Comput. Interact. 35(20), 1936–1946 (2019). https://doi.org/10.1080/10447318.2019.1587857


  30. Li, X., Schroeter, R., Rakotonirainy, A., Kuo, J., Lenné, M.G.: Effects of different non-driving-related-task display modes on drivers’ eye-movement patterns during take-over in an automated vehicle. Transport. Res. F: Traffic Psychol. Behav. 70, 135–148 (2020)


  31. Helander, M., Landauer, T., Prabhu, P.: Handbook of Human-Computer Interaction. Elsevier (1997)


  32. Campbell, F.W.: The depth of field of the human eye. Opt. Acta 4, 157–164 (1957)


  33. Velger, M.: Helmet-Mounted Displays and Sights. Artech House, Boston, London (1997)


  34. Schömig, N., Wiedemann, K., Naujoks, F., Neukum, A., Leuchtenberg, B., Vöhringer-Kuhnt, T.: An augmented reality display for conditionally automated driving. Adjunct Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 137–141 (2018)

  35. Peli, E.: Optometric and perceptual issues with head-mounted displays (HMD). In: Mouroulis, P. (ed.) Optical Design for Visual Instrumentation, pp. 205–276. McGraw-Hill, New York (1999)

  36. Singh, S.S., Pattnaik, S.S., Sardana, H.K., Bajpai, P.P.: Analysis of errors and distortions in stroke form of symbology for head-up displays. Lecture Notes in Engineering and Computer Science 2196, 1302–1306 (2012)

  37. McPhee, L.C., Scialfa, C.T., Dennis, W.M., Ho, G., Caird, J.K.: Age differences in visual search for traffic signs during a simulated conversation. Hum. Factors 46(4), 674–685 (2004). https://doi.org/10.1518/hfes.46.4.674.56817


  38. Edgar, G.K.: Accommodation, cognition, and virtual image displays: a review of the literature. Displays 28(2), 45–59 (2007). https://doi.org/10.1016/j.displa.2007.04.009


  39. Bach, M.: Manual of the Freiburg Vision Test ‘FrACT’, Version 3.9.8. http://www.michaelbach.de/fract/media/FrACT3_Manual.pdf. Accessed 10 March 2017

  40. Peters, M., Ivanoff, J.: Performance asymmetries in computer mouse control of right-handers, and left-handers with left- and right-handed mouse experience. J. Mot. Behav. 31(1), 86–94 (1999)


  41. Maruyama, A., Takahashi, K., Rothwell, J.C.: Interaction between left dorsal premotor and right primary motor cortex during a left-hand visual go/no-go reaction time task. Brain Stimul. 1(3), 255 (2008). https://doi.org/10.1016/j.brs.2008.06.081


  42. Colenbrander, A., Consilium Ophthalmologicum Universale Visual Functions Committee: Visual acuity measurement standard. Ital. J. Ophthalmol. 2(1), 1–15 (1988)

  43. USA Department of Defense: Human engineering design criteria for military systems, equipment and facilities (MIL-STD-1472F). Navy Publishing and Printing Office, Philadelphia, PA (1998)


  44. Woodworth, R.S., Schlosberg, H.: Experimental Psychology. Henry Holt, New York (1954)


  45. Pastor, M.A., Artieda, J. (eds.): Time, Internal Clocks, and Movement. Elsevier (1996)

  46. Artieda, J., Pastor, M.A., Lacruz, F., Obeso, J.A.: Temporal discrimination is abnormal in Parkinson’s disease. Brain 115, 199–210 (1992)


  47. Green, J.B., Reese, C.L., Pegues, J.J., Eliot, F.A.: Ability to distinguish two cutaneous stimuli separated by a brief time interval. Neurology 11, 1006–1010 (1961). https://doi.org/10.1212/wnl.11.11.1006


  48. Knott, V.C., Demmelmair, S., Bengler, K.: Distraction and driving behavior by presenting information on an “emissive projection display” compared to a head-up display. Proceedings of the 12th International Conference on Engineering Psychology and Cognitive Ergonomics (2015). https://doi.org/10.1007/978-3-319-20373-7_2

  49. Betancur, J.A., Suarez, D.: System and method for interacting with a mobile device using a head-up display. United States Patent and Trademark Office, Alexandria (2023)


  50. Lif, P., Oskarsson, P.A., Lindahl, B., Hedström, J., Svensson, J.: Multimodal threat cueing in simulated combat vehicle with tactile information switching between threat and waypoint indication. In: Symposium on Human Interface, pp. 454–461. Springer (2011)

  51. François, M., Osiurak, F., Fort, A., Crave, P., Navarro, J.: Automotive HMI design and participatory user involvement: review and perspectives. Ergonomics 60, 541–552 (2016). https://doi.org/10.1080/00140139.2016.1188218


  52. Betancur, J.A., Gómez, N., Castro, M., Suárez, D., Merienne, F.: User experience comparison among touchless, haptic and voice head-up displays interfaces in automobiles. Int. J. Interact. Des. Manuf. 12, 1469–1479 (2018). https://doi.org/10.1007/s12008-018-0498-0


  53. Betancur, J.A., Villa-Espinal, J., Osorio-Gómez, G., Cuellar, S., Suárez, D.: Research topics and implementation trends on automotive head-up display systems. Int. J. Interact. Des. Manuf. 12, 199–214 (2018). https://doi.org/10.1007/s12008-016-0350-3


  54. Ware, C.: Information Visualization: Perception for Design. Morgan Kaufmann, San Francisco (2019)



Author information

Corresponding author

Correspondence to J. Alejandro Betancur.

Ethics declarations

Conflict of interest

There are no relevant financial or non-financial competing interests related to this work.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Betancur, J.A., Vargas, H., Sanchez, C. et al. Visual guidelines integration for automotive head-up displays interfaces. Int J Interact Des Manuf (2024). https://doi.org/10.1007/s12008-024-01877-0


