
The Power of Personal Ontologies: Individual Traits Prevail Over Robot Traits in Shaping Robot Humanization Perceptions

Published in the International Journal of Social Robotics

Abstract

This study examines facets of robot humanization, defined as how people think of robots as social and human-like entities, through perceptions of liking, human-likeness, and rights entitlement. It investigates how robots’ traits (gender, physical humanness, and relational status) and participants’ individual differences (past robot experience, efficacy, and personality) together influence humanization perceptions. Findings show that the robots’ features were less influential than participants’ individual traits. Specifically, participants’ prior real-life exposure to robots and perceived technology competence were positively related to robot humanization, while individuals with a higher internal locus of control and negative evaluations of robots in media were less inclined to humanize robots. The implications of these findings for understanding the unfolding “relational turn” in human-machine communication are then considered: at present, technological features appear to matter less than people’s ontological understanding of social robots in shaping their humanization perceptions.


Figures 1–5 appear in the full article.


Data Availability

Requests for data may be sent to the corresponding author.

Code Availability

Not applicable.

Notes

  1. Qualtrics has its own compensation structure to pay participants, so the exact amount participants received for completing the survey is unknown. We paid Qualtrics $4.50/participant to complete the online experiment.

  2. Sample size was determined based on 100 participants/condition; total N was reduced to 1,020 after cleaning the data for straight-liners. A post-hoc power analysis for a multiple linear regression indicated that a sample of 1,020 participants, with 17 predictors, an alpha of 0.05 and a conservative effect size (f = 0.02) yielded a statistical power of 0.81 [101].

  3. Following [91], a confirmatory factor analysis (CFA) was run for each previously validated scale among the dependent and independent variables. In some cases (for perceived robot-human likeness, extraversion, locus of control, and perceived technology competence), scales were modified slightly (e.g., 1–2 items removed) to improve the goodness of fit. The CFA for mediated view of robots revealed that a two-factor structure was a much better fit and so the measure was divided into two variables capturing negative and positive mediated views of robots separately.
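The data cleaning and power calculation described in note 2 can be sketched as follows. This is an illustrative reconstruction, not the authors’ code: the toy response matrix is invented, and the reported effect size of 0.02 is interpreted as Cohen’s f² (G*Power’s convention for multiple regression, with noncentrality λ = f²·N), which is consistent with the reported power of 0.81.

```python
import numpy as np
from scipy.stats import f as f_dist, ncf

# --- Screening for straight-liners (illustrative toy data) ---
# A "straight-liner" answers every item in a scale block identically.
responses = np.array([
    [3, 3, 3, 3, 3],   # straight-liner
    [1, 4, 2, 5, 3],
    [2, 2, 2, 2, 2],   # straight-liner
    [5, 1, 4, 2, 3],
])
straight = (responses == responses[:, :1]).all(axis=1)
clean = responses[~straight]          # rows retained for analysis

# --- Post-hoc power for multiple linear regression (note 2) ---
# Treating the reported effect size of 0.02 as Cohen's f^2 (assumption).
n, k, alpha, f2 = 1020, 17, 0.05, 0.02
df1, df2 = k, n - k - 1               # numerator / denominator df
lam = f2 * n                          # noncentrality, G*Power convention
f_crit = f_dist.ppf(1 - alpha, df1, df2)
power = ncf.sf(f_crit, df1, df2, lam) # P(reject H0 | f^2 = 0.02)
print(f"power = {power:.2f}")
```

The computed power lands at roughly the 0.81 reported in the note, which supports reading the note’s “f = 0.02” as a conservative (small) f² effect size.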

References

  1. Giger JC, Piçarra N, Alves-Oliveira P et al (2019) Humanization of robots: is it really such a good idea? Hum Behav Emerg Technol 1(2):111–123. https://doi.org/10.1002/hbe2.147


  2. Breazeal C (2004) Designing Sociable Robots. MIT Press, Cambridge, MA


  3. Guzman A (2018a) What is human-machine communication, anyway? In: Guzman A (ed) Human-machine communication: rethinking communication, technology, and ourselves. Peter Lang, New York, pp 1–29


  4. Reeves B, Nass CI (1996) The media equation: how people treat computers, television, and new media like real people and places. Cambridge University Press, Cambridge, UK

  5. Guzman A, Lewis SC (2020) Artificial intelligence and communication: a human–machine communication research agenda. New Media & Society 22(1):1–17. https://doi.org/10.1177/1461444819858691

  6. Appel J, von der Pütten A, Krämer NC et al (2012) Does humanity matter? Analyzing the importance of social cues and perceived agency of a computer system for the emergence of social reactions during human-computer interaction. Advances in Human-Computer Interaction. https://doi.org/10.1155/2012/324694

  7. Kidd C, Breazeal C (2005) Comparison of social presence in robots and animated characters. Interaction Studies

  8. Hancock PA, Billings DR, Schaefer KE et al (2011) A meta-analysis of factors affecting trust in human-robot interaction. Hum Factors 53(5):517–527. https://doi.org/10.1177/0018720811417254


  9. Lee KM, Peng W, Jin SA et al (2006) Can robots manifest personality? An empirical test of personality recognition, social responses, and social presence in human–robot interaction. J Communication 56(4):754–772. https://doi.org/10.1111/j.1460-2466.2006.00318.x

  10. Haslam N (2006) Dehumanization: an integrative review. Personality and Social Psychology Review 10(3):252–264. https://doi.org/10.1207/s15327957pspr1003_4


  11. Nomura T, Kanda T, Suzuki T (2006) Experimental investigation into influence of negative attitudes toward robots on human–robot interaction. AI Soc 20(2):138–150. https://doi.org/10.1007/s00146-005-0012-7


  12. Katz JE, Halpern D (2014) Attitudes towards robot’s suitability for various jobs as affected robot appearance. Behav Inform Technol 33(9):941–953. https://doi.org/10.1080/0144929X.2013.783115


  13. Coeckelbergh M (2010) Robot rights? Towards a social-relational justification of moral consideration. Ethics Inf Technol 12(3):209–221. https://doi.org/10.1007/s10676-010-9235-5


  14. Edwards A (2018) Animals, humans, and machines: interactive implications of ontological classification. In: Guzman A (ed) Human-machine communication: rethinking communication, technology, and ourselves. Peter Lang, New York


  15. Gunkel DJ (2018a) The other question: can and should robots have rights? Ethics Inf Technol 20(2):87–99. https://doi.org/10.1007/s10676-017-9442-4


  16. Giger JC, Piçarra N, Alves-Oliveira P, Oliveira R, Arriaga P (2019) Humanization of robots: is it really such a good idea? Hum Behav Emerg Technol 1(2):111–123. https://doi.org/10.1002/hbe2.147


  17. Araujo T (2018) Living up to the chatbot hype: the influence of anthropomorphic design cues and communicative agency framing on conversational agent and company perceptions. Comput Hum Behav 85:183–189. https://doi.org/10.1016/j.chb.2018.03.051


  18. Phillips E, Zhao X, Ullman D et al (2018) What is human-like?: decomposing robot human-like appearance using the anthropomorphic roBOT (ABOT) database. HRI ‘18: Proceedings of the Eleventh Annual ACM/IEEE International Conference on Human-Robot Interaction 105–133. https://doi.org/10.1145/3171221.3171268

  19. Chamorro-Premuzic R, Ahmetoglu G (2016) The pros and cons of robot managers. Harvard Business Rev, 12 December

  20. Mori M (1970) The uncanny valley. Energy 7(4):33–35


  21. Wang S, Lilienfeld SO, Rochat P (2015) The uncanny valley: existence and explanations. Rev Gen Psychol 19(4):393–407. https://doi.org/10.1037/gpr0000056


  22. Yamada Y, Kawabe T, Ihaya K (2013) Categorization difficulty is associated with negative evaluation in the uncanny valley phenomenon. Jpn Psychol Res 55(1):20–32. https://doi.org/10.1111/j.1468-5884.2012.00538.x


  23. Ferrari F, Paladino MP, Jetten J (2016) Blurring human–machine distinctions: anthropomorphic appearance in social robots as a threat to human distinctiveness. Int J Social Robot 8(2):287–302. https://doi.org/10.1007/s12369-016-0338-y


  24. Gray K, Wegner DM (2012) Feeling robots and human zombies: mind perception and the uncanny valley. Cognition 125(1):125–130. https://doi.org/10.1016/j.cognition.2012.06.007


  25. Mays KK, Krongard S, Katz JE (2019) Robots revisited: Cyberdystopia, robotphobia, and social perceptions of robots in the evolving AI landscape. Presented at the Human Machine Communication (HMC) preconference at ICA 2019 in Washington, D.C

  26. Edwards A, Edwards C, Westerman D et al (2019) Initial expectations, interactions, and beyond with social robots. Comput Hum Behav 90:308–314. https://doi.org/10.1016/j.chb.2018.08.042


  27. Beraldo G, Di Battista S, Badaloni S et al (2018) Sex differences in expectations and perception of a social robot. In 2018 IEEE Workshop on Advanced Robotics and its Social Impacts (ARSO): 38–43. https://doi.org/10.1109/ARSO.2018.8625826

  28. Bernotat J, Eyssel F, Sachse J (2021) The (fe) male robot: how robot body shape impacts first impressions and trust towards robots. Int J Social Robot 13(3):477–489. https://doi.org/10.1007/s12369-019-00562-7


  29. Jung EH, Waddell TF, Sundar SS (2016) Feminizing robots: User responses to gender cues on robot body and screen. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems: 3107–3113. https://doi.org/10.1145/2851581.2892428

  30. Yu CE, Ngan HFB (2019) The power of head tilts: gender and cultural differences of perceived human vs human-like robot smile in service. Tourism Rev 74(3):428–442. https://doi.org/10.1108/TR-07-2018-0097


  31. Kraus M, Kraus J, Baumann M et al (2018) Effects of gender stereotypes on trust and likability in spoken human-robot interaction. In Proceedings of the eleventh international conference on language resources and evaluation (LREC 2018)

  32. Ghazali AS, Ham J, Barakova EI et al (2018) Effects of robot facial characteristics and gender in persuasive human-robot interaction. Front Rob AI 5:73. https://doi.org/10.3389/frobt.2018.00073


  33. Rogers K, Bryant DA, Howard A (2020) Robot gendering: Influences on trust, occupational competency, and preference of robot over human. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems: 1–7. https://doi.org/10.1145/3334480.3382930

  34. Bryant DA, Borenstein J, Howard A (2020) Why should we gender? The effect of robot gendering and occupational stereotypes on human trust and perceived competency. In Proceedings of the 2020 ACM/IEEE international conference on human-robot interaction: 13–21. https://doi.org/10.1145/3319502.3374778

  35. Reich-Stiebert N, Eyssel F (2017) (Ir)relevance of gender? On the influence of gender stereotypes on learning with a robot. In 2017 12th ACM/IEEE International Conference on Human-Robot Interaction: 166–176. https://doi.org/10.1145/2909824.3020242

  36. Nass C, Moon Y, Green N (1997) Are machines gender neutral? Gender-stereotypic responses to computers with voices. J Appl Soc Psychol 27(10):864–876. https://doi.org/10.1111/j.1559-1816.1997.tb00275.x


  37. Eyssel F, Hegel F (2012) (S)he’s got the look: gender stereotyping of robots. J Appl Soc Psychol 42(9):2213–2230. https://doi.org/10.1111/j.1559-1816.2012.00937.x

  38. Kuchenbrandt D, Häring M, Eichberg J et al (2014) Keep an eye on the task! How gender typicality of tasks influence human–robot interactions. Int J Social Robot 6(3):417–427. https://doi.org/10.1007/s12369-014-0244-0


  39. Otterbacher J, Talias M (2017) S/he’s too warm/agentic! The influence of gender on uncanny reactions to robots. In 2017 12th ACM/IEEE International Conference on Human-Robot Interaction: 214–223. https://doi.org/10.1145/2909824.3020220

  40. Appel M, Izydorczyk D, Weber S et al (2020) The uncanny of mind in a machine: humanoid robots as tools, agents, and experiencers. Comput Hum Behav 102:274–286. https://doi.org/10.1016/j.chb.2019.07.031


  41. Crowell CR, Villano M, Scheutz M et al (2009) Gendered voice and robot entities: perceptions and reactions of male and female subjects. In 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems: 3735–3741. https://doi.org/10.1109/IROS.2009.5354204

  42. Eyssel F, Kuchenbrandt D (2012) Social categorization of social robots: Anthropomorphism as a function of robot group membership. Br J Soc Psychol 51(4):724–731. https://doi.org/10.1111/j.2044-8309.2011.02082.x


  43. Gunkel DJ (2018b) The relational turn: third wave HCI and phenomenology. In: Filimowicz M, Tzankova V (eds) New directions in third Wave Human-Computer Interaction: volume 1-Technologies. Springer, Cham, pp 11–24. https://doi.org/10.1007/978-3-319-73356-2_2


  44. Taipale S, Fortunati L (2018) Communicating with machines: Robots as the next new media. In: Guzman A (ed) Human-machine communication: rethinking communication, technology, and ourselves. Peter Lang, New York, pp 201–220


  45. Dautenhahn K, Woods S, Kaouri C et al (2005) What is a robot companion? – Friend, assistant, or butler? Proceedings of the IEEE/Robotics Society of Japan International Conference on Intelligent Robots and Systems: 1488–1493. https://doi.org/10.1109/IROS.2005.1545189

  46. Takayama L, Ju W, Nass C (2008) Beyond dirty, dangerous and dull: What everyday people think robots should do. Proceedings of 3rd ACM / IEEE international conference on human robot interaction: 25–32. https://doi.org/10.1145/1349822.1349827

  47. Kim Y, Mutlu B (2014) How social distance shapes human–robot interaction. Int J Hum Comput Stud 72(12):783–795. https://doi.org/10.1016/j.ijhcs.2014.05.005


  48. Hinds PJ, Roberts TL, Jones H (2004) Whose job is it anyway? A study of human–robot interaction in a collaborative task. Human–Computer Interact 19(1):151–181. https://doi.org/10.1080/07370024.2004.9667343


  49. Kwak SS, Kim Y, Kim E et al (2013) What makes people empathize with an emotional robot? The impact of agency and physical embodiment on human empathy for a robot. 2013 IEEE RO-MAN: 180–185. https://doi.org/10.1109/ROMAN.2013.6628441

  50. Eisenberg N, Eggum ND, Di Giunta L (2010) Empathy-related responding: associations with prosocial behavior, aggression, and intergroup relations. Social Issues and Policy Review 4(1):143–180. https://doi.org/10.1111/j.1751-2409.2010.01020.x

  51. Goetz J, Kiesler S, Powers A (2003) Matching robot appearance and behavior to tasks to improve human-robot cooperation. The 12th IEEE International Workshop on Robot and Human Interactive Communication Proceedings: 55–60. https://doi.org/10.1109/ROMAN.2003.1251796

  52. Eagly AH, Wood W (2016) Social role theory of sex differences. The Wiley Blackwell Encyclopedia of Gender and Sexuality Studies: 1–3

  53. Spence PR, Westerman D, Lin X (2018) A robot will take your job. How does that make you feel? In: Guzman S (ed) Human-machine communication: rethinking communication, technology, and ourselves. Peter Lang, New York, pp 185–200


  54. Guzman A (2018b) Beyond extraordinary: theorizing artificial intelligence and the self in daily life. In: Papacharissi Z (ed) A networked self and human augmentics, Artificial Intelligence, Sentience. Routledge, New York, NY. https://doi.org/10.4324/9781315202082-7


  55. Lombard M, Xu K (2021) Social responses to media technologies in the 21st century: the media are social actors paradigm. Human-Machine Communication 2:29–55. https://doi.org/10.30658/hmc.2.2


  56. Fischer K (2011) Interpersonal variation in understanding robots as social actors. 2011 6th ACM/IEEE International Conference on Human-Robot Interaction: 53–60. https://doi.org/10.1145/1957656.1957672

  57. MacDorman KF, Entezari SO (2015) Individual differences predict sensitivity to the uncanny valley. Interact Stud 16(2):141–172. https://doi.org/10.1075/is.16.2.01mac


  58. Rosen LD, Sears DC, Weil MM (1993) Treating technophobia: a longitudinal evaluation of the computerphobia reduction program. Comput Hum Behav 9(1):27–50. https://doi.org/10.1016/0747-5632(93)90019-O


  59. Orr C, Allen D, Poindexter S (2001) The effect of individual differences on computer attitudes: an empirical study. J Organizational End User Comput 13(2):26–39. https://doi.org/10.4018/joeuc.2001040103


  60. Teo T, Noyes J (2014) Explaining the intention to use technology among pre-service teachers: a multi-group analysis of the Unified Theory of Acceptance and Use of Technology. Interact Learn Environ 22(1):51–66. https://doi.org/10.1080/10494820.2011.641674


  61. Saadé RG, Kira D (2007) Mediating the impact of technology usage on perceived ease of use by anxiety. Comput Educ 49(4):1189–1204. https://doi.org/10.1016/j.compedu.2006.01.009


  62. Davis FD (1989) Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q 13(3):319–340. https://doi.org/10.2307/249008


  63. Nomura T, Suzuki T, Kanda T, Kato K (2006) Measurement of anxiety toward robots. ROMAN 2006-The 15th IEEE International Symposium on Robot and Human Interactive Communication: 372–377. https://doi.org/10.1109/ROMAN.2006.314462

  64. Sundar SS, Waddell TF, Jung EH (2016) The Hollywood Robot Syndrome: media effects on older adults’ attitudes toward robots and adoption intentions. 2016 11th ACM/IEEE International Conference on Human-Robot Interaction: 343–350. https://doi.org/10.1109/HRI.2016.7451771

  65. Banks J (2020) Optimus primed: media cultivation of robot mental models and social judgments. Front Rob AI 7:62. https://doi.org/10.3389/frobt.2020.00062


  66. Horstmann AC, Krämer NC (2019) Great expectations? Relation of previous experiences with social robots in real life or in the media and expectancies based on qualitative and quantitative assessment. Front Psychol 10. https://doi.org/10.3389/fpsyg.2019.00939

  67. de Graaf MM, Allouch SB (2013) Exploring influencing variables for the acceptance of social robots. Robot Auton Syst 61(12):1476–1486. https://doi.org/10.1016/j.robot.2013.07.007


  68. Schermerhornz P, Scheutz M, Crowell CR (2008) Robot social presence and gender: Do females view robots differently than males? Proceedings of the 3rd ACM/IEEE International Conference on Human Robot Interaction: 263–270. https://doi.org/10.1145/1349822.1349857

  69. Katz JE, Halpern D, Crocker ET (2015) In the company of robots: views of acceptability of robots in social settings. In: Vincent J et al (eds) Social Robots from a human perspective. Springer, Cham, pp 25–38. https://doi.org/10.1007/978-3-319-15672-9_3


  70. Heerink M (2011) Exploring the influence of age, gender, education and computer experience on robot acceptance by older adults. In 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI): 147–148. https://doi.org/10.1145/1957656.1957704

  71. Nomura T (2017) Robots and gender. Gender and the Genome 1(1):18–25. https://doi.org/10.1089/gg.2016.29002.nom

  72. Siegel M, Breazeal C, Norton M (2009) Persuasive robotics: The influence of robot gender on human behavior. 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems: 2563–2568. https://doi.org/10.1109/IROS.2009.5354116

  73. Edwards C, Edwards A, Stoll B et al (2019) Evaluations of an artificial intelligence instructor’s voice: Social Identity Theory in human-robot interactions. Comput Hum Behav 90:357–362. https://doi.org/10.1016/j.chb.2018.08.027


  74. Horstmann AC, Krämer NC (2019) Great expectations? Relation of previous experiences with social robots in real life or in the media and expectancies based on qualitative and quantitative assessment. Front Psychol 10. https://doi.org/10.3389/fpsyg.2019.00939

  75. Venkatesh V (2000) Determinants of perceived ease of use: integrating control, intrinsic motivation, and emotion into the technology acceptance model. Inform Syst Res 11(4):342–365. https://doi.org/10.1287/isre.11.4.342.11872


  76. Rotter JB (1966) Generalized expectancies for internal versus external control of reinforcement. Psychol Monographs: Gen Appl 80(1):1–28. https://doi.org/10.1037/h0092976


  77. Hsia JW, Chang CC, Tseng AH (2014) Effects of individuals’ locus of control and computer self-efficacy on their e-learning acceptance in high-tech companies. Behav Inform Technol 33(1):51–64. https://doi.org/10.1080/0144929X.2012.702284


  78. Hsia JW (2016) The effects of locus of control on university students’ mobile learning adoption. J Comput High Educ 28(1):1–17. https://doi.org/10.1007/s12528-015-9103-8


  79. Lida BL, Chaparro BS (2002) Using the locus of control personality dimension as a predictor of online behavior. Proceedings of the Human Factors and Ergonomics Society Annual Meeting 46(14): 1286–1290. https://doi.org/10.1177/154193120204601410

  80. Fong LHN, Lam LW, Law R (2017) How locus of control shapes intention to reuse mobile apps for making hotel reservations: evidence from Chinese consumers. Tour Manag 61:331–342. https://doi.org/10.1016/j.tourman.2017.03.002

  81. Coovert MD, Goldstein M (1980) Locus of control as a predictor of users’ attitude toward computers. Psychol Rep 47:1167–1173. https://doi.org/10.2466/pr0.1980.47.3f.1167


  82. Crable EA, Brodzinski JD, Scherer RF et al (1994) The impact of cognitive appraisal, locus of control, and level of exposure on the computer anxiety of novice computer users. J Educational Comput Res 10(4):329–340. https://doi.org/10.2190/K2YH-MMJV-GBBL-YTTU

  83. Mays KK, Lei Y, Giovanetti R, Katz JE (2022) AI as a boss? A national US survey of predispositions governing comfort with expanded AI roles in society. AI Soc 37(4):1587–1600


  84. Robert L (2018) Personality in the human robot interaction literature: A review and brief critique. Proceedings of the 24th Americas Conference on Information Systems: 16–18

  85. Salem M, Lakatos G, Amirabdollahian F et al (2015) Would you trust a (faulty) robot? Effects of error, task type and personality on human-robot cooperation and trust. 10th ACM/IEEE International Conference on Human-Robot Interaction: 1–8. https://doi.org/10.1145/2696454.2696497

  86. Damholdt MF, Nørskov M, Yamazaki R et al (2015) Attitudinal change in elderly citizens toward social robots: the role of personality traits and beliefs about robot functionality. Front Psychol 6:1701. https://doi.org/10.3389/fpsyg.2015.01701


  87. MacDorman KF (2006) Subjective ratings of robot video clips for human likeness, familiarity, and eeriness: An exploration of the uncanny valley. In ICCS/CogSci-2006 Long Symposium: Toward Social Mechanisms of Android Science: 26–29

  88. Bartneck C, Kulić D, Croft E et al (2009) Measurement instruments for the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots. Int J Social Robot 1(1):71–81. https://doi.org/10.1007/s12369-008-0001-3


  89. Ho CC, MacDorman KF (2010) Revisiting the uncanny valley theory: developing and validating an alternative to the Godspeed indices. Comput Hum Behav 26(6):1508–1518. https://doi.org/10.1016/j.chb.2010.05.015

  90. Złotowski J, Yogeeswaran K, Bartneck C (2017) Can we control it? Autonomous robots threaten human identity, uniqueness, safety, and resources. Int J Hum Comput Stud 100:48–54. https://doi.org/10.1016/j.ijhcs.2016.12.008


  91. Levine T, Hullett CR, Turner MM, Lapinski MK (2006) The desirability of using confirmatory factor analysis on published scales. Communication Res Rep 23(4):309–314. https://doi.org/10.1080/08824090600962698

  92. Ashrafian H (2015) Artificial intelligence and robot responsibilities: innovating beyond rights. Sci Eng Ethics 21(2):317–326. https://doi.org/10.1007/s11948-014-9541-0


  93. Eysenck SBG, Eysenck HJ, Barrett P (1985) A revised version of the psychoticism scale. Pers Indiv Differ 6(1):21–29. https://doi.org/10.1016/0191-8869(85)90026-1


  94. Katz JE, Aspden P, Reich WA (1997) Public attitudes toward voice-based electronic messaging technologies in the United States: a national survey of opinions about voice response units and telephone answering machines. Behav Inform Technol 16(3):125–144. https://doi.org/10.1080/014492997119860


  95. Gambino A, Fox J, Ratan RA (2020) Building a stronger CASA: extending the Computers are Social Actors paradigm. Human-Machine Communication 1:71–86. https://doi.org/10.30658/hmc.1.5


  96. Edwards C, Edwards A, Spence PR et al (2016) Initial interaction expectations with robots: testing the human-to-human interaction script. Communication Stud 67(2):227–238. https://doi.org/10.1080/10510974.2015.1121899


  97. Sundar SS (2020) Rise of machine agency: a framework for studying the psychology of Human–AI Interaction (HAII). J Computer-Mediated Communication 25(1):74–88. https://doi.org/10.1093/jcmc/zmz026


  98. Lima G, Kim C, Ryu S et al (2020) Collecting the public perception of AI and robot rights. Proceedings of the ACM on Human-Computer Interaction, 4(CSCW2): 1–24. https://doi.org/10.1145/3415206

  99. Waytz A, Heafner J, Epley N (2014) The mind in the machine: Anthropomorphism increases trust in an autonomous vehicle. J Exp Soc Psychol 52:113–117. https://doi.org/10.1016/j.jesp.2014.01.005


  100. Waytz A, Cacioppo J, Epley N (2010) Who sees human? The stability and importance of individual differences in anthropomorphism. Perspect Psychol Sci 5(3):219–232. https://doi.org/10.1177/1745691610369336


  101. Faul F, Erdfelder E, Buchner A, Lang A-G (2009) Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses. Behav Res Methods 41:1149–1160


Funding

No external funding was received. This study was supported by the Division of Emerging Media Studies at Boston University through a Feld Research Grant.

Author information


Contributions

All authors contributed to the study conception and design. Material preparation, data collection, data analysis, and most manuscript writing were performed by KKM. Data collection, significant review and manuscript editing were performed by JC. Both authors read and approved the final manuscript.

Corresponding author

Correspondence to Kate K. Mays.

Ethics declarations

Conflicts of Interest

The authors have no conflicts of interest to report.

Ethics Approval

The study was reviewed by Boston University’s Charles River Campus Institutional Review Board and deemed to meet the criteria for exemption in accordance with CFR 46.101(b)(2)(i) on March 13, 2020 (IRB #5477X).

Consent to Participate

Informed consent was obtained from all participants in the study.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


Cite this article

Mays, K.K., Cummings, J.J. The Power of Personal Ontologies: Individual Traits Prevail Over Robot Traits in Shaping Robot Humanization Perceptions. Int J of Soc Robotics 15, 1665–1682 (2023). https://doi.org/10.1007/s12369-023-01045-6
