Gender Differences in Human Trust and Contextual Perception of Robots

Authors

  • Yang Rao, Zhongkai University of Agriculture and Engineering
  • Ruquan Yang, Zhongkai University of Agriculture and Engineering
  • Shuangyi He, Zhongkai University of Agriculture and Engineering
  • Wenyi Lin, Zhongkai University of Agriculture and Engineering
  • Ruizhao Cai, People's Government of Jiangdong Town, Jiangdong Town Industrial Development Service Center

DOI:

https://doi.org/10.4108/eetpht.11.11058

Keywords:

Gender Differences, Robot Trust, Perceived Warmth, Perceived Threat, Vignette-Based Experiment, Human-Robot Interaction

Abstract

INTRODUCTION: Trust is a prerequisite for safe and effective Human–Robot Interaction (HRI), yet reported gender differences are inconsistent and likely contingent on context and socio-perceptual processes.

OBJECTIVES: Within a unified framework spanning four canonical HRI contexts (healthcare, education, manufacturing, security), we test whether (a) gender predicts trust, (b) context moderates gender effects, and (c) perceived warmth and perceived threat mediate gender–trust relations.

METHODS: A vignette-based experiment with adults (N = 132; male/female) measured affective and cognitive trust, perceived warmth, and perceived threat on 7-point scales. Analyses followed a preregistered plan: 2×4 mixed ANOVAs (Gender × Context) and parallel mediation (PROCESS Model 4; 5,000 bootstrap resamples) with covariates (age, education, prior HRI experience).
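The parallel mediation with percentile-bootstrap confidence intervals described above can be sketched in plain NumPy. This is a minimal stand-in for PROCESS Model 4 (without the covariates), run on simulated data with hypothetical effect sizes; the coefficients and variable codings below are illustrative assumptions, not the study's data or results.

```python
import numpy as np

# Hypothetical simulated data standing in for the study's variables
# (gender, perceived warmth, perceived threat, overall trust).
rng = np.random.default_rng(42)
n = 132
gender = rng.integers(0, 2, n).astype(float)   # 0 = male, 1 = female (assumed coding)
warmth = 0.5 * gender + rng.normal(0, 1, n)    # assumed a1 path
threat = -0.4 * gender + rng.normal(0, 1, n)   # assumed a2 path
trust = 0.6 * warmth - 0.5 * threat + 0.1 * gender + rng.normal(0, 1, n)

def ols(X, y):
    """Least-squares coefficients with an intercept column prepended."""
    X1 = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X1, y, rcond=None)[0]

def indirect_effects(g, m1, m2, y):
    """Return the two indirect effects: a1*b1 (via warmth) and a2*b2 (via threat)."""
    a1 = ols(g[:, None], m1)[1]               # gender -> warmth
    a2 = ols(g[:, None], m2)[1]               # gender -> threat
    b = ols(np.column_stack([g, m1, m2]), y)  # trust regressed on gender + mediators
    return np.array([a1 * b[2], a2 * b[3]])

# Percentile bootstrap with 5,000 resamples, as in the preregistered plan.
boot = np.empty((5000, 2))
for i in range(5000):
    idx = rng.integers(0, n, n)
    boot[i] = indirect_effects(gender[idx], warmth[idx], threat[idx], trust[idx])
ci = np.percentile(boot, [2.5, 97.5], axis=0)
print("95% bootstrap CI, warmth path:", ci[:, 0])
print("95% bootstrap CI, threat path:", ci[:, 1])
```

An indirect effect is considered significant when its percentile interval excludes zero, which is the criterion PROCESS applies to the bootstrapped a*b products.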

RESULTS: Gender showed a significant main effect for affective trust (females > males), but not for cognitive trust. Context effects were significant for both trust facets. Gender × Context interactions emerged: the female advantage in affective trust was concentrated in healthcare, while males reported higher cognitive trust in education and manufacturing. Mediation indicated that females’ higher perceived warmth and lower perceived threat jointly accounted for gender differences in overall trust; the direct gender effect was not significant after including mediators. Robustness checks (ANCOVAs; order effects) supported all primary findings.

CONCLUSION: Gender differences in robot trust are context-dependent and arise via warmth-enhancing and threat-reducing socio-perceptual pathways. Design should emphasize empathy/assurance cues in caring roles and competence/reliability cues in task/authority roles, alongside systematic threat mitigation.



Published

19-01-2026

How to Cite

Rao Y, Yang R, He S, Lin W, Cai R. Gender Differences in Human Trust and Contextual Perception of Robots. EAI Endorsed Trans Perv Health Tech [Internet]. 2026 Jan 19 [cited 2026 Jan 20];11. Available from: https://publications.eai.eu/index.php/phat/article/view/11058