Abstract
Social robots are artificial, socially intelligent partners designed to interact with humans in a variety of contexts. When well accepted by users, they can take on roles (e.g., personal assistant or companion) that are particularly relevant when other humans are absent, and can thereby improve quality of life. Because the main purpose of social robots is to interact with humans, they must be able to establish and maintain a relationship with them. In this context, the chapter introduces Human-Robot Interaction, a field that has grown in importance, particularly for social robots, as robotics has recently moved from industrial environments into human environments. The chapter also explains the factors that affect interaction with a social robot, including the uncanny valley, proxemics, empathy, trust, engagement, and emotional design.
Copyright information
© 2019 Springer Nature Switzerland AG
About this chapter
Cite this chapter
Ayanoğlu, H., Sequeira, J.S. (2019). Human-Robot Interaction. In: Ayanoğlu, H., Duarte, E. (eds) Emotional Design in Human-Robot Interaction. Human–Computer Interaction Series. Springer, Cham. https://doi.org/10.1007/978-3-319-96722-6_3
DOI: https://doi.org/10.1007/978-3-319-96722-6_3
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-96721-9
Online ISBN: 978-3-319-96722-6