When Agents Become Partners: A Review of the Role the Implicit Plays in the Interaction with Artificial Social Agents


The way we interact with computers has changed significantly over recent decades. However, interaction with computers still falls behind human-to-human interaction in terms of seamlessness, effortlessness, and satisfaction. We argue that simultaneously using verbal, nonverbal, explicit, implicit, intentional, and unintentional communication channels addresses these three aspects of the interaction process. To better understand what has been done in the field of Human-Computer Interaction (HCI) to incorporate the types of channels mentioned above, we reviewed the literature on implicit nonverbal interaction, with a specific emphasis on the interaction between humans on the one side, and robots and virtual humans on the other. These Artificial Social Agents (ASAs) are increasingly used as advanced tools for solving not only physical but also social tasks. In the literature review, we identify domains of interaction between humans and artificial social agents that have shown exponential growth over the years. The review highlights the value of incorporating implicit interaction capabilities in Human-Agent Interaction (HAI), which we believe will lead to satisfying human and artificial social agent team performance. We conclude the article by presenting a case study of a system that harnesses subtle nonverbal, implicit interaction to increase the state of relaxation in users. This “Virtual Human Breathing Relaxation System” works on the principle of physiological synchronisation between a human and a virtual, computer-generated human. The active entrainment concept behind the relaxation system is generic and can be applied to other domains of implicit, physiology-based human-agent interaction.

Publication DOI: https://doi.org/10.3390/mti4040081
Divisions: College of Engineering & Physical Sciences > School of Computer Science and Digital Technologies > Software Engineering & Cybersecurity
College of Engineering & Physical Sciences
College of Engineering & Physical Sciences > School of Computer Science and Digital Technologies > Applied AI & Robotics
College of Engineering & Physical Sciences > Smart and Sustainable Manufacturing
College of Engineering & Physical Sciences > Aston Centre for Artificial Intelligence Research and Application
College of Engineering & Physical Sciences > School of Computer Science and Digital Technologies
Additional Information: © 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Uncontrolled Keywords: verbal and nonverbal communication, interaction, explicit and implicit interaction, human to human interaction, human to human communication, human-robot interaction, human-agent interaction, human virtual-human interaction, gaze, proxemics, physiological synchrony, entrainment, Human-Computer Interaction, Neuroscience (miscellaneous), Computer Networks and Communications, Computer Science Applications
Publication ISSN: 2414-4088
Last Modified: 12 Feb 2024 08:42
Date Deposited: 25 Nov 2020 08:18
Related URLs: https://www.mdp ... 414-4088/4/4/81 (Publisher URL)
http://www.scop ... tnerID=8YFLogxK (Scopus URL)
PURE Output Type: Review article
Published Date: 2020-11-22
Accepted Date: 2020-11-19
Authors: Dar, Sanobar (ORCID Profile 0000-0002-2287-7001)
Bernardet, Ulysses (ORCID Profile 0000-0003-4659-3035)



Version: Published Version

License: Creative Commons Attribution



