Explainable Human-Robot Interaction for imitation learning in Augmented Reality

Anna Belardinelli, Chao Wang, Michael Gienger, "Explainable Human-Robot Interaction for imitation learning in Augmented Reality", 16th International Workshop on Human-Friendly Robotics, 2023.

Abstract

Imitation learning could enable non-expert users to teach new skills to robots in an interactive and intuitive way. Still, when teaching a task, it is often difficult for the teacher to grasp what the robot knows or to assess whether a correct task representation is being formed. To address this problem, suitable online feedback should be given by the robot to explain its perceptual beliefs. Here, we introduce an explainable design for human-robot interaction during learning by demonstration of simple kitchen tasks in Augmented Reality. To communicate the robot's feedback during the demonstration, two modalities are explored: one purely AR-based (AR-XAI), by which the perceptual beliefs of the robot are interactively overlaid on the shared workspace, as perceived by the teacher; and one more human-like (gaze-speech), where the robot's understanding is signaled by gaze following and verbal utterances. We conducted a user study to assess which modality is more effective in shaping the user's perception and to evaluate whether a combination of holographic and embodied social cues would further improve the user experience. Our results show that the multi-modal combination is indeed best appreciated and contributes to a better perception of the robot, especially among non-experts.
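As a purely illustrative aid (the paper itself provides no code), the sketch below shows one way the two feedback channels described in the abstract could be dispatched from a single belief representation, with the multimodal condition triggering both. All names here (`PerceptualBelief`, `Modality`, `communicate_belief`) are hypothetical and not taken from the paper.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Modality(Enum):
    AR_XAI = auto()       # holographic overlays in the shared workspace
    GAZE_SPEECH = auto()  # embodied cues: gaze following plus verbal utterances
    MULTIMODAL = auto()   # both channels combined


@dataclass
class PerceptualBelief:
    """Hypothetical container for what the robot believes it has perceived."""
    object_label: str   # e.g. "cup"
    action_label: str   # e.g. "grasp"
    confidence: float   # belief strength in [0, 1]


def communicate_belief(belief: PerceptualBelief, modality: Modality) -> list[str]:
    """Return the feedback actions the robot would trigger for one belief.

    In a real system these would be calls into an AR renderer and the
    robot's gaze/speech controllers; here they are string descriptions.
    """
    actions: list[str] = []
    if modality in (Modality.AR_XAI, Modality.MULTIMODAL):
        # Overlay the belief on the shared workspace, anchored to the object.
        actions.append(
            f"AR overlay: highlight '{belief.object_label}' "
            f"with label '{belief.action_label}' ({belief.confidence:.0%})"
        )
    if modality in (Modality.GAZE_SPEECH, Modality.MULTIMODAL):
        # Signal understanding with embodied social cues.
        actions.append(f"Gaze: look at '{belief.object_label}'")
        actions.append(
            f"Speech: 'I see you are {belief.action_label}ing "
            f"the {belief.object_label}.'"
        )
    return actions


if __name__ == "__main__":
    belief = PerceptualBelief("cup", "grasp", 0.92)
    for action in communicate_belief(belief, Modality.MULTIMODAL):
        print(action)
```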


