Estimating user's engagement from eye-gaze behaviors in human-agent conversations

YI Nakano, R Ishii - Proceedings of the 15th international conference on …, 2010 - dl.acm.org
In face-to-face conversations, speakers are continuously checking whether the listener is
engaged in the conversation, and they change their conversational strategy if the listener is not …

A speech-driven hand gesture generation method and evaluation in android robots

CT Ishi, D Machiyashiki, R Mikata… - IEEE Robotics and …, 2018 - ieeexplore.ieee.org
Hand gestures commonly occur in daily dialogue interactions and have important functions
in communication. We first analyzed multimodal human–human dialogue data and found …

Gaze awareness in conversational agents: Estimating a user's conversational engagement from eye gaze

R Ishii, YI Nakano, T Nishida - ACM Transactions on Interactive …, 2013 - dl.acm.org
In face-to-face conversations, speakers are continuously checking whether the listener is
engaged in the conversation, and they change their conversational strategy if the listener is …

Probabilistic human-like gesture synthesis from speech using GRU-based WGAN

B Wu, C Liu, CT Ishi, H Ishiguro - … of the 2021 international conference on …, 2021 - dl.acm.org
Gestures are crucial for increasing the human-likeness of agents and robots to achieve
smoother interactions with humans. The realization of an effective system to model human …
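
The title of this entry names a GRU-based WGAN for speech-driven gesture synthesis, while the snippet gives only the motivation. The sketch below is purely illustrative of that general setup (a GRU generator conditioned on speech features and a GRU critic trained with a Wasserstein objective); the class names, feature dimensions, and shapes are assumptions for the example, not the authors' implementation.

# Illustrative sketch (not the authors' code): a GRU generator and GRU critic
# for speech-conditioned gesture synthesis trained with a WGAN-style objective.
# All dimensions and names below are assumed for demonstration purposes.
import torch
import torch.nn as nn

class GestureGenerator(nn.Module):
    # Maps per-frame speech features plus noise to a sequence of joint poses.
    def __init__(self, speech_dim=26, noise_dim=16, hidden_dim=128, pose_dim=36):
        super().__init__()
        self.gru = nn.GRU(speech_dim + noise_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, pose_dim)

    def forward(self, speech, noise):
        h, _ = self.gru(torch.cat([speech, noise], dim=-1))
        return self.out(h)  # (batch, frames, pose_dim)

class GestureCritic(nn.Module):
    # Scores how realistic a (speech, pose) sequence pair looks; no sigmoid in WGAN.
    def __init__(self, speech_dim=26, pose_dim=36, hidden_dim=128):
        super().__init__()
        self.gru = nn.GRU(speech_dim + pose_dim, hidden_dim, batch_first=True)
        self.score = nn.Linear(hidden_dim, 1)

    def forward(self, speech, poses):
        h, _ = self.gru(torch.cat([speech, poses], dim=-1))
        return self.score(h[:, -1])  # one scalar score per sequence

G, D = GestureGenerator(), GestureCritic()
speech = torch.randn(4, 100, 26)              # toy batch: 4 clips, 100 frames
real_poses = torch.randn(4, 100, 36)
fake_poses = G(speech, torch.randn(4, 100, 16))
critic_loss = D(speech, fake_poses).mean() - D(speech, real_poses).mean()
gen_loss = -D(speech, fake_poses).mean()
# A complete WGAN would also add a gradient penalty or weight clipping on the critic.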

Impact of personality on nonverbal behavior generation

R Ishii, C Ahuja, YI Nakano, LP Morency - Proceedings of the 20th ACM …, 2020 - dl.acm.org
To realize natural-looking virtual agents, one key technical challenge is to automatically
generate nonverbal behaviors from spoken language. Since nonverbal behavior varies …

Visual and linguistic information in gesture classification

J Eisenstein, R Davis - ACM SIGGRAPH 2007 courses, 2007 - dl.acm.org
Classification of natural hand gestures is usually approached by applying pattern
recognition to the movements of the hand. However, the gesture categories most frequently …

A robot for reconstructing presentation behavior in lecture

T Ishino, M Goto, A Kashihara - … of the 6th International Conference on …, 2018 - dl.acm.org
In universities, lecturers often use presentation slides to present their lecture contents with
non-verbal behavior involving paralanguage, gaze, and gesture, which is important for …

Selecting Iconic Gesture Forms Based on Typical Entity Images

YI Nakano, F Nihei, R Ishii… - Journal of Information …, 2024 - jstage.jst.go.jp
Hand gestures are communication signals that emphasize an important part of an utterance
and express the concept of emphasized words. Iconic gestures are hand gestures that …

Barrier Function to Skin Elasticity in Talking Head

I Chaturvedi, V Pandelea, E Cambria, R Welsch… - Cognitive …, 2024 - Springer
In this paper, we target the problem of generating facial expressions from a piece of audio.
This is challenging since both audio and video have inherent characteristics that are distinct …

The design of a generic framework for integrating ECA components

HH Huang, T Nishida, A Cerekovic, IS Pandzic… - AAMAS (1), 2008 - academia.edu
Embodied Conversational Agents (ECAs) are life-like computer-generated
characters that interact with human users in face-to-face multi-modal conversations. ECA …