Modelling Multi-Party Interactions among Virtual Characters, Robots, and Humans

Zerrin Yumak, Jianfeng Ren, Nadia Magnenat Thalmann, Junsong Yuan

Research output: Journal Publication › Article › peer-review

21 Citations (Scopus)

Abstract

3D virtual humans and physical human-like robots can be used to interact with people in a remote location in order to increase the feeling of presence. In a telepresence setup, their behaviors are driven by real participants. We envision that when the real users are absent, because they have to leave or do not want to perform a repetitive task, control of the robots can be handed to an artificial intelligence component that sustains the ongoing interaction. When human-mediated interaction is required again, control can be returned to the real users. One of the main challenges in telepresence research is adapting the 3D position and orientation of the remote participants to the actual physical environment so that appropriate eye contact and gesture awareness are maintained in a group conversation. If the human behind the robot and/or the virtual human leaves, the multi-party interaction should be handed over to an artificial intelligence component. In this paper, we discuss the challenges in autonomous multi-party interaction among virtual characters, human-like robots, and real participants, and describe a prototype system to study these challenges.
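As a purely illustrative aside (not taken from the paper), the eye-contact challenge above amounts to orienting each robot or virtual-human head toward the position where a remote participant has been mapped in the local room. The short Python sketch below computes the required yaw and pitch angles; the coordinate convention, function name, and numeric values are assumptions made for this example only.

    import math

    def gaze_angles(head_pos, target_pos):
        """Yaw and pitch (radians) that turn a robot or virtual-human head
        toward a target; both positions are (x, y, z) in the same local
        frame, assumed here to be x forward, y left, z up."""
        dx = target_pos[0] - head_pos[0]
        dy = target_pos[1] - head_pos[1]
        dz = target_pos[2] - head_pos[2]
        yaw = math.atan2(dy, dx)                    # rotation about the up axis
        pitch = math.atan2(dz, math.hypot(dx, dy))  # elevation toward the target
        return yaw, pitch

    if __name__ == "__main__":
        # Hypothetical mapping: remote participant placed at (1.5, 0.5, 1.6) m
        # in the local room, robot head at (0.0, 0.0, 1.2) m.
        yaw, pitch = gaze_angles((0.0, 0.0, 1.2), (1.5, 0.5, 1.6))
        print(f"yaw = {math.degrees(yaw):.1f} deg, pitch = {math.degrees(pitch):.1f} deg")

In practice, the mapping of remote participants into the local frame would come from the telepresence system's calibration, and the resulting angles would drive the head controller of the robot or the gaze animation of the virtual character.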

Original language: English
Pages (from-to): 172-190
Number of pages: 19
Journal: Presence: Teleoperators and Virtual Environments
Volume: 23
Issue number: 2
DOIs
Publication status: Published - 2014
Externally published: Yes

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Human-Computer Interaction
  • Computer Vision and Pattern Recognition
