Who Do We Trust? Attachment, Anthropomorphism, and Age in Human–AI Relations
Abstract
This study examined age-related differences in attachment styles, anthropomorphism, and trust in artificial intelligence (AI). The primary aim was to examine how emotional bonds with, and perceptions of, AI systems differ among adults at different developmental stages. A total of 92 participants aged 16 to 57 completed an online survey assessing attachment patterns, anthropomorphic perceptions, and trust in AI. Younger adults reported significantly higher levels of preoccupied (anxious) attachment than older groups. Trust in AI also differed significantly across age groups, with middle-aged adults reporting the highest trust, whereas anthropomorphism did not vary significantly by age. Regression analysis showed that attachment styles did not significantly predict anthropomorphism overall; dismissive attachment showed a modest individual association that should be interpreted cautiously given the nonsignificant overall model. These findings suggest developmental differences in attachment anxiety and trust in AI, while anthropomorphic tendencies appear relatively stable across adulthood. Given the modest sample size and cross-sectional design, the results should be interpreted cautiously and warrant replication in larger samples.
References
Bartholomew, K., & Horowitz, L. M. (1991). Attachment styles among young adults: A test of a four-category model. Journal of Personality and Social Psychology, 61(2), 226–244. https://doi.org/10.1037/0022-3514.61.2.226
Bosmans, G., Waters, T. E. A., Finet, C., De Winter, S., & Hermans, D. (2019). Trust development as an expectancy-learning process: Testing contingency effects. PLoS ONE, 14(12), e0225934. https://doi.org/10.1371/journal.pone.0225934
Charles, S. T., & Carstensen, L. L. (2010). Social and emotional aging. Annual Review of Psychology, 61(1), 383–409. https://doi.org/10.1146/annurev.psych.093008.100448
Chin, M. G., Yordon, R. E., Clark, B. R., Ballion, T., Dolezal, M. J., Shumaker, R., & Finkelstein, N. (2005). Developing an anthropomorphic tendencies scale. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 49(13), 1266–1268. https://doi.org/10.1177/154193120504901311
Chopik, W. J., Edelstein, R. S., & Fraley, R. C. (2012). From the cradle to the grave: Age differences in attachment from early adulthood to old age. Journal of Personality, 81(2), 171–183. https://doi.org/10.1111/j.1467-6494.2012.00793.x
Chopik, W. J., Edelstein, R. S., & Grimm, K. J. (2019). Longitudinal changes in attachment orientation over a 59-year period. Journal of Personality and Social Psychology, 116(4), 598–611. https://doi.org/10.1037/pspp0000167
David, J., Stowe, M., Caruana, N., & Norberg, M. M. (2025). The 6-item specific object anthropomorphism scale: A new questionnaire for children and adults. PeerJ, 13, e20153. https://doi.org/10.7717/peerj.20153
de Visser, E. J., Pak, R., & Shaw, T. H. (2018). From ‘automation’ to ‘autonomy’: the importance of trust repair in human–machine interaction. Ergonomics, 61(10), 1409–1427. https://doi.org/10.1080/00140139.2018.1457725
Deng, Z., & Yan, J. (2025). The effect of perceived warmth, competence, and social presence of AI-driven chatbots on consumers’ engagement and satisfaction. SAGE Open, 15(3). https://doi.org/10.1177/21582440251365438
Frazier, M. L., Johnson, P. D., & Fainshmidt, S. (2013). Development and validation of a propensity to trust scale. Journal of Trust Research, 3(2), 76–97. https://doi.org/10.1080/21515581.2013.820026
Gillath, O., Ai, T., Branicky, M. S., Keshmiri, S., Davison, R. B., & Spaulding, R. (2020). Attachment and trust in artificial intelligence. Computers in Human Behavior, 115, 106607. https://doi.org/10.1016/j.chb.2020.106607
Gillespie, N., Lockey, S., Ward, T., Macdade, A., & Hassed, G. (2025). Trust, attitudes and use of artificial intelligence: A global study 2025. The University of Melbourne and KPMG. https://doi.org/10.26188/28822919
Hancock, P. A., Kessler, T. T., Kaplan, A. D., Brill, J. C., & Szalma, J. L. (2020). Evolving trust in robots: Specification through sequential and comparative meta-analyses. Human Factors, 63(7), 1196–1229. https://doi.org/10.1177/0018720820922080
Heng, S., & Zhang, Z. (2025). Attachment anxiety and problematic use of conversational artificial intelligence: Mediation of emotional attachment and moderation of anthropomorphic tendencies. Psychology Research and Behavior Management, 18, 1775–1785. https://doi.org/10.2147/prbm.s531805
Hong, Y., Lian, J., Xu, L., Min, J., Wang, Y., Freeman, L. J., & Deng, X. (2022). Statistical perspectives on reliability of artificial intelligence systems. Quality Engineering, 35(1), 56–78. https://doi.org/10.1080/08982112.2022.2089854
Jian, J., Bisantz, A. M., & Drury, C. G. (2000). Foundations for an empirically determined scale of trust in automated systems. International Journal of Cognitive Ergonomics, 4(1), 53–71. https://doi.org/10.1207/s15327566ijce0401_04
Körber, M. (2018). Theoretical considerations and development of a questionnaire to measure trust in automation. In Advances in intelligent systems and computing (pp. 13–30). https://doi.org/10.1007/978-3-319-96074-6_2
Kubovics, M. (2025). The impact of age groups’ attitudes towards artificial intelligence and data protection. International Journal of Business and Economics Research, 14(3), 99–108. https://doi.org/10.11648/j.ijber.20251403.13
Kullgren, J., Solway, E., Roberts, S., Brewer, R., Singer, D., Kirch, M., Box, N., Strunk, S., & Smith, E. (2025). National Poll on Healthy Aging: How older adults use and think about AI. Deep Blue (University of Michigan). https://doi.org/10.7302/26593
Matkin, G. S., Headrick, J., & Sunderman, H. (2023, June 1). Developing trust & being trustworthy. Developing Human Potential. https://pressbooks.nebraska.edu/developinghumanpotential/chapter/developing-trust-being-trustworthy/
Min, J., Hong, Y., King, C. B., & Meeker, W. Q. (2022). Reliability analysis of artificial intelligence systems using recurrent events data from autonomous vehicles. Journal of the Royal Statistical Society Series C (Applied Statistics), 71(4), 987–1013. https://doi.org/10.1111/rssc.12564
Morillo-Mendez, L., Schrooten, M. G. S., Loutfi, A., & Mozos, O. M. (2022). Age-related differences in the perception of robotic referential gaze in human–robot interaction. International Journal of Social Robotics, 16(6), 1069–1081. https://doi.org/10.1007/s12369-022-00926-6
Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56(1), 81–103. https://doi.org/10.1111/0022-4537.00153
Platts, L. G., Norbrian, A. A., & Frick, M. A. (2022). Attachment in older adults is stably associated with health and quality of life: findings from a 14-year follow-up of the Whitehall II study. Aging & Mental Health, 27(9), 1832–1842. https://doi.org/10.1080/13607863.2022.2148157
Polek, E. (2008). Attachment in cultural context: Differences in attachment between Eastern and Western Europeans. The University of Groningen Research Portal. https://research.rug.nl/en/publications/attachment-in-cultural-context-differences-in-attachment-between-/
Ren, K., Lin, Y., & Gunderson, E. A. (2019). The role of inhibitory control in strategy change: The case of linear measurement. Developmental Psychology, 55(7), 1389–1399. https://doi.org/10.1037/dev0000739
Rotter, J. B. (1967). Interpersonal Trust Scale [Database record]. PsycTESTS. https://doi.org/10.1037/t02271-000
Roundtable on Public Interfaces of the Life Sciences, Board on Life Sciences, Division on Earth and Life Studies, Board on Science Education, Division of Behavioral and Social Sciences and Education, & The National Academies of Sciences, Engineering, and Medicine. (2015). Trust and confidence at the interfaces of the life sciences and society: Does the public trust science? National Academies Press (US). https://doi.org/10.17226/21798
Ryan, M. (2020). In AI we trust: ethics, artificial intelligence, and reliability. Science and Engineering Ethics, 26(5), 2749–2767. https://doi.org/10.1007/s11948-020-00228-y
Sagone, E., Commodari, E., Indiana, M. L., & La Rosa, V. L. (2023). Exploring the association between attachment style, psychological well-being, and relationship status in young adults and adults: A cross-sectional study. European Journal of Investigation in Health Psychology and Education, 13(3), 525–539. https://doi.org/10.3390/ejihpe13030040
Segal, D. L., Needham, T. N., & Coolidge, F. L. (2009). Age differences in attachment orientations among younger and older adults: Evidence from two self-report measures of attachment. The International Journal of Aging and Human Development, 69(2), 119–132. https://doi.org/10.2190/ag.69.2.c
Spatola, N., Marchesi, S., & Wykowska, A. (2022). Different models of anthropomorphism across cultures and ontological limits in current frameworks: The integrative framework of anthropomorphism. Frontiers in Robotics and AI, 9, 863319. https://doi.org/10.3389/frobt.2022.863319
Stockton, S. (2023). Dunnhumby report: One in five U.S. consumers trust AI. The Shelby Report. https://theshelbyreport.com/2023/11/02/dunnhumby-report-one-in-five-u-s-consumers-trust-ai/
Waytz, A., Cacioppo, J., & Epley, N. (2010). Who sees human? The stability and importance of individual differences in anthropomorphism. Perspectives on Psychological Science, 5(3), 219–232. https://doi.org/10.1177/1745691610369336
Yang, F., & Oshio, A. (2025). Using attachment theory to conceptualize and measure the experiences in human-AI relationships. Current Psychology, 44(11), 10658–10669. https://doi.org/10.1007/s12144-025-07917-6
Zimmerman, A., Janhonen, J., & Beer, E. (2023). Human/AI relationships: challenges, downsides, and impacts on human/human relationships. AI And Ethics, 4(4), 1555–1567. https://doi.org/10.1007/s43681-023-00348-8