Multimodal sentiment analysis: A survey and comparison

R Kaur, S Kautish - Research anthology on implementing sentiment …, 2022 - igi-global.com
Multimodal sentiments have become a challenge for researchers and are equally difficult for a machine to understand. One of the studies that supports MS problems is …

Automatic engagement estimation in smart education/learning settings: a systematic review of engagement definitions, datasets, and methods

SN Karimah, S Hasegawa - Smart Learning Environments, 2022 - Springer
Background: Recognizing learners' engagement during learning processes is important for
providing personalized pedagogical support and preventing dropouts. As learning …

Amigos: A dataset for affect, personality and mood research on individuals and groups

JA Miranda-Correa, MK Abadi… - IEEE transactions on …, 2018 - ieeexplore.ieee.org
We present AMIGOS-A dataset for Multimodal research of affect, personality traits and mood
on Individuals and GrOupS. Different from other databases, we elicited affect using both short …

ASCERTAIN: Emotion and personality recognition using commercial sensors

R Subramanian, J Wache, MK Abadi… - IEEE Transactions …, 2016 - ieeexplore.ieee.org
We present ASCERTAIN-a multimodal databaASe for impliCit pERsonaliTy and Affect
recognitIoN using commercial physiological sensors. To our knowledge, ASCERTAIN is the …

Affective image content analysis: Two decades review and new perspectives

S Zhao, X Yao, J Yang, G Jia, G Ding… - … on Pattern Analysis …, 2021 - ieeexplore.ieee.org
Images can convey rich semantics and induce various emotions in viewers. Recently, with
the rapid advancement of emotional intelligence and the explosive growth of visual data …

Analysis of EEG signals and facial expressions for continuous emotion detection

M Soleymani, S Asghari-Esfeden, Y Fu… - IEEE Transactions on …, 2015 - ieeexplore.ieee.org
Emotions are time-varying affective phenomena that are elicited as a result of stimuli. Videos
and movies in particular are made to elicit emotions in their audiences. Detecting the …

Multimodal emotion recognition in response to videos

M Soleymani, M Pantic, T Pun - IEEE transactions on affective …, 2011 - ieeexplore.ieee.org
This paper presents a user-independent emotion recognition method with the goal of
recovering affective tags for videos using electroencephalogram (EEG), pupillary response …

DECAF: MEG-based multimodal database for decoding affective physiological responses

MK Abadi, R Subramanian, SM Kia… - IEEE Transactions …, 2015 - ieeexplore.ieee.org
In this work, we present DECAF-a multimodal data set for decoding user physiological
responses to affective multimedia content. Different from data sets such as DEAP [15] and …

Hawkes processes for events in social media

MA Rizoiu, Y Lee, S Mishra, L **e - Frontiers of multimedia research, 2017 - dl.acm.org
This chapter provides an accessible introduction to point processes, and especially Hawkes
processes, for modeling discrete, inter-dependent events over continuous time. We start by …
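The cited chapter introduces Hawkes processes for modeling discrete, self-exciting events over continuous time. As a purely illustrative aid (not code from the chapter), the sketch below simulates a univariate Hawkes process with an exponential kernel via Ogata's thinning algorithm; the function name simulate_hawkes and the parameters mu, alpha, beta are hypothetical choices for this example.

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, t_max, seed=0):
    """Simulate event times of a univariate Hawkes process via Ogata's thinning.

    Conditional intensity (illustrative, exponential kernel):
        lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))
    Stationarity requires alpha / beta < 1.
    """
    random.seed(seed)
    events = []
    t = 0.0
    while t < t_max:
        # Intensity at the current time bounds lambda(s) for s > t until the next event,
        # because the exponential kernel only decays between events.
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        # Propose the next candidate event time from the bounding (homogeneous) rate.
        t += random.expovariate(lam_bar)
        if t >= t_max:
            break
        # Thinning step: accept the candidate with probability lambda(t) / lam_bar.
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if random.random() <= lam_t / lam_bar:
            events.append(t)
    return events

if __name__ == "__main__":
    times = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, t_max=50.0)
    print(f"simulated {len(times)} events; first few: {times[:5]}")
```

Each accepted event raises the intensity by alpha, so events cluster in bursts, which is the self-exciting behavior that makes Hawkes processes a natural fit for cascades of social-media activity.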

Deep learning for video classification and captioning

Z Wu, T Yao, Y Fu, YG Jiang - Frontiers of multimedia research, 2017 - dl.acm.org
Today's digital contents are inherently multimedia: text, audio, image, video, and so on.
Video, in particular, has become a new way of communication between Internet users with …