Investigating explainable human-robot interaction with augmented reality

Chao Wang and Anna Belardinelli, "Investigating explainable human-robot interaction with augmented reality", International Workshop on Virtual, Augmented, and Mixed-Reality for Human-Robot Interactions (VAM@HRI2022), 2022.

Abstract

In learning by demonstration with social robots, a fluid and coordinated interaction between the human teacher and the robotic learner is particularly critical, yet often difficult to assess. This is even more the case if robots are to learn from non-expert users. In such cases, it can be difficult for the teacher to grasp what the robot knows, or to judge whether a correct representation of the task has been formed, before the robot demonstrates it back. Here, we introduce a new feedback modality that uses Augmented Reality to visualize the perceptual beliefs of the robot in an interactive way. These cues are overlaid directly on the shared workspace as perceived by the teacher, without the need for explicit inquiry. This allows the teacher to access the robot's situation understanding, to adapt their demonstration online, and finally to review the observed sequence. We further propose an experimental framework to assess the benefits of such a feedback modality compared to more established modalities such as gaze and speech, and to collect dyadic data in a quick, integrated, and relatively realistic way. The planned user study will help assess human-robot coordination across communicative cues and the combination of different modalities for explainable robotics.
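The core mechanism described in the abstract, anchoring the robot's perceptual beliefs in the teacher's AR view rather than delivering them through a separate channel, can be illustrated with a minimal sketch. The following Python snippet is purely illustrative and not the authors' implementation: it assumes a hypothetical list of detected objects with 3D positions and confidence scores, and uses a standard pinhole-camera projection to place each belief label at the corresponding pixel of the headset image.

```python
import numpy as np

# Hypothetical perceptual beliefs: object label, 3D position in the
# AR headset's camera frame (meters), and detection confidence.
beliefs = [
    ("red cup",   np.array([0.10, -0.05, 0.60]), 0.92),
    ("blue bowl", np.array([-0.20, 0.02, 0.75]), 0.78),
]

# Illustrative pinhole intrinsics for the headset camera (fx, fy, cx, cy).
K = np.array([[525.0,   0.0, 320.0],
              [  0.0, 525.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(point_3d: np.ndarray) -> tuple:
    """Project a 3D point in the camera frame onto the image plane."""
    uvw = K @ point_3d
    return int(uvw[0] / uvw[2]), int(uvw[1] / uvw[2])

# Overlay each belief as a label anchored at the object's projected pixel,
# so the teacher sees what the robot currently perceives, in place.
for label, position, confidence in beliefs:
    u, v = project(position)
    print(f"draw '{label}' ({confidence:.0%}) at pixel ({u}, {v})")
```

In an actual system, the `print` call would be replaced by the AR engine's rendering routine; the point of the sketch is that the feedback is registered to the shared workspace itself, so the teacher can check the robot's situation understanding without issuing an explicit query.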


