*Result*: HR-ACT (Human-Robot Action) Database: Communicative and noncommunicative action videos featuring a human and a humanoid robot.
Original Publication: Austin, Tex. : Psychonomic Society, c2005-
*Further Information*
*We present the HR-ACT (Human-Robot Action) Database, a collection of 80 standardized videos featuring matched communicative and noncommunicative actions performed by a humanoid robot (Pepper) and a human actor. We describe the creation of 40 action exemplars per agent, with actions matched in manner, timing, and number of repetitions. The database includes detailed normative data collected from 438 participants, providing metrics on action identification, confidence ratings, communicativeness ratings, meaning clusters, and H values (an entropy-based measure reflecting response homogeneity). We provide researchers with controlled yet naturalistic stimuli in multiple formats: videos, image frames, and raw animation files (.qanim). These materials support diverse research applications in human-robot interaction, cognitive psychology, and neuroscience. The database enables systematic investigation of action perception across human and robotic agents, and the inclusion of raw animation files allows researchers using Pepper robots to implement these actions in real-time experiments. The full set of stimuli, along with comprehensive normative data and documentation, is publicly available at https://osf.io/8vsxq/.
(© 2026. The Author(s).)*
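The abstract does not spell out how the H values are computed, but in normative databases of this kind the H statistic is conventionally the Shannon-entropy measure of name agreement introduced by Snodgrass and Vanderwart (1980): H = Σᵢ pᵢ · log₂(1/pᵢ) over the distinct response (meaning) clusters for a stimulus. Assuming that convention, a minimal sketch of the computation (the function name and example responses are illustrative, not from the database):

```python
import math
from collections import Counter

def h_value(responses):
    """Entropy-based H statistic over participants' coded responses.

    H = sum_i p_i * log2(1 / p_i), where p_i is the proportion of
    responses falling into meaning cluster i. H = 0 when every
    participant gives the same response (perfect agreement); larger
    H indicates more heterogeneous responses.
    """
    counts = Counter(responses)
    n = len(responses)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

# Hypothetical example: 8 of 10 participants label an action "waving",
# 2 label it "greeting".
print(round(h_value(["waving"] * 8 + ["greeting"] * 2), 3))  # → 0.722
```

Under this convention, H is sensitive to both the number of distinct response clusters and how evenly responses are spread across them, which is why it is reported alongside simple percent agreement.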
*Declarations. Ethical Approval: The study was approved by the Human Research Ethics Committee of Bilkent University (ID: 2020_04_06_06) and was conducted in accordance with the ethical standards laid down in the 1964 Declaration of Helsinki and its later amendments or comparable ethical standards. Conflicts of Interest: The authors declare no conflicts of interest. Consent to Participate: Tuvana Dilan Karaduman, one of the authors and the human actor in the database, consented to participate in the creation of the action video database. The participants in the normative studies gave consent to participate. Consent for Publication: The human actor consented to her videos and images being published in academic journals and conferences, and shared with other researchers upon academic request. Participants in the normative studies consented to their de-identified data being made publicly accessible for research purposes.*