Publication

Body gesture recognition to control a social mobile robot

Conference Article

Conference

ACM/IEEE International Conference on Human-Robot Interaction (HRI)

Edition

2023

Pages

456-460

Doc link

https://doi.org/10.1145/3568294.3580126

Abstract

In this work, we propose a gesture-based language that allows humans to interact with robots using their bodies in a natural way. We have created a new gesture detection model based on neural networks, together with a new dataset of humans performing a collection of body gestures, which is used to train this architecture. Furthermore, we compare body gesture communication with other communication channels to demonstrate the importance of adding this capability to robots. The presented approach is validated in diverse simulations and real-life experiments with non-trained volunteers. It attains promising results, establishing the approach as a valuable framework for social robotic applications such as human-robot collaboration and human-robot interaction.

Categories

Automation

Author keywords

Datasets, neural networks, human-robot interaction

Scientific reference

J. Laplaza, R. Romero, A. Sanfeliu and A. Garrell Zulueta. Body gesture recognition to control a social mobile robot. In Companion of the 2023 ACM/IEEE International Conference on Human-Robot Interaction (HRI '23), Stockholm, 2023, pp. 456-460.