
EEMCS EPrints Service

A First Step toward the Automatic Understanding of Social Touch for Naturalistic Human–Robot Interaction

Jung, M.M., Poel, M., Reidsma, D. and Heylen, D.K.J. (2017) A First Step toward the Automatic Understanding of Social Touch for Naturalistic Human–Robot Interaction. Frontiers in ICT, 4:3. ISSN 2297-198X

Full text available as:

Open Access (856 Kb)
Social robots should be able to automatically understand and respond to human touch. The meaning of touch depends not only on its form but also on the context in which it takes place. To gain more insight into the factors relevant to interpreting the meaning of touch within a social context, we elicited touch behaviors by letting participants interact with a robot pet companion in different affective scenarios. In a contextualized lab setting, participants (n = 31) acted as if they were coming home in different emotional states (i.e., stressed, depressed, relaxed, and excited) without being given specific instructions on the kinds of behaviors they should display. Based on video footage of the interactions and on interviews, we explored the touch behaviors used, the social messages expressed, and the robot pet responses expected. Results show that emotional state influenced both the social messages communicated to the robot pet and the expected responses. Furthermore, participants used multimodal cues to communicate with the robot pet: they often talked to it while touching it and making eye contact. Additionally, the findings indicate that categorizing touch behaviors into discrete touch gesture categories based on dictionary definitions is not a suitable approach for capturing the complex nature of touch behaviors in less controlled settings. These findings can inform the design of a behavioral model for robot pet companions; future directions for interpreting touch behaviors in less controlled settings are discussed.

Item Type: Article
Research Group: EWI-HMI: Human Media Interaction
Research Program: CTIT-General
Research Project: COMMIT/P04+P05: Virtual Worlds for Well-being
Uncontrolled Keywords: social touch, human–robot interaction, robot pet companion, multimodal interaction, touch recognition, behavior analysis, affective context
ID Code: 27850
Deposited On: 22 April 2017
