
Dyadic Interactions and Interpersonal Perception: An Exploration of Behavioral Cues for Technology-Assisted Mediation

Hifza Javed, Nina Moorman, Thomas Weisswange, Nawid Jamali, "Dyadic Interactions and Interpersonal Perception: An Exploration of Behavioral Cues for Technology-Assisted Mediation", 15th International Conference on Applied Human Factors and Ergonomics (AHFE 2024), 2024.

Abstract

Mediators aim to shape group dynamics in various ways, such as improving trust and cohesion, balancing participation, and promoting constructive conflict resolution. Technological systems used to mediate human-human interactions must be able to continuously assess the state of the interaction and generate appropriate actions. To this end, an understanding of the collective affective state of the group is needed to produce meaningful actions that improve the interpersonal dynamics within the group. In this paper, we study behavioral cues that indicate interpersonal perception in dyadic social interactions. These cues may be used by technology-assisted mediation systems to evaluate interpersonal perception and produce effective mediation strategies for group interactions. Traditional approaches to emotion recognition focus on evaluating individual internal states rather than group states. Additionally, prior research on group affect predominantly relies on datasets annotated by external observers. However, for technology-assisted mediation to be effective, assistance should be based on the judgments of the group members themselves and must capture interpersonal affective states rather than the individual, internal affect of each group member. To investigate interpersonal affect in dyadic interactions, we use a dataset of 30 dyadic interaction sessions in which individuals interact remotely in a negotiation task. The interactants retrospectively rate how agreeable or disagreeable their partner came across to them. This dataset provides affect ratings that, first, come from the interactants themselves rather than external observers; second, record interpersonal perception from the perspective of each individual; and third, evaluate each interactant's perception of their partner rather than their own affective state. This offers an opportunity to study how the observable behavioral cues that reflect interpersonal affect in an interaction may vary between group members. We take a multi-perspective approach to evaluating interpersonal affect in dyadic interactions, employing computational models to investigate behavioral cues that reflect interpersonal perception. We extract two-dimensional facial landmarks and acoustic features for each interactant and use this multimodal feature space to study the behavioral cues that inform interpersonal perception. In particular, we use feature importance analysis with a random forest classifier to investigate how the behavioral cues vary between the individual providing the rating and the individual being rated. We also examine how these cues differ between interaction segments labeled with negative and positive interpersonal affect. Using a multi-perspective approach that evaluates interpersonal affect from the perspective of each interactant, we find that including both interactants' observations when predicting interpersonal affect effectively captures the interplay of behaviors between them. Our findings also suggest that facial features carry more predictive power than acoustic features for both interactants in this dataset. In particular, head movements are found to be strongly correlated with interpersonal affect. Lastly, acoustic features carry more predictive power during interactions with negative interpersonal affect than during those with positive affect. These findings offer nuanced insights into interpersonal dynamics in dyadic interactions, which will benefit future work on technology-assisted social mediation.
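The feature importance analysis described above can be sketched as follows. This is a minimal illustration, not the paper's code: the feature dimensions, the synthetic data, and the split into "facial" and "acoustic" blocks are all assumptions made for demonstration, using scikit-learn's random forest and its impurity-based `feature_importances_`.

```python
# Hypothetical sketch of random-forest feature importance analysis over a
# multimodal feature space, mirroring the method named in the abstract.
# All dimensions and data here are synthetic, illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

n_segments = 200
# Assumed feature layout: the first block stands in for flattened 2-D
# facial landmarks, the second for acoustic features.
n_facial, n_acoustic = 20, 10
X = rng.normal(size=(n_segments, n_facial + n_acoustic))
# Binary interpersonal-affect labels (0 = negative, 1 = positive),
# synthesized here so that the "facial" block carries the signal.
y = (X[:, :n_facial].mean(axis=1) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X, y)

# Impurity-based importances sum to 1 across all features; aggregating
# them per modality gives each modality's share of predictive power.
importances = clf.feature_importances_
facial_share = importances[:n_facial].sum()
acoustic_share = importances[n_facial:].sum()
print(f"facial: {facial_share:.2f}, acoustic: {acoustic_share:.2f}")
```

In practice, the per-modality shares would be computed separately for each interactant's features and for segments labeled with negative versus positive affect, which is how modality- and perspective-level comparisons like those reported above can be drawn.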



