Human brain EEG indices of emotions: Delineating responses to affective vocalizations by measuring frontal theta event-related synchronization

MYV Bekkedal, J Rossi III, J Panksepp - Neuroscience & Biobehavioral …, 2011 - Elsevier
At present there is no direct brain measure of basic emotional dynamics from the human
brain. EEG provides non-invasive approaches for monitoring brain electrical activity to …

On the time course of vocal emotion recognition

MD Pell, SA Kotz - PLoS ONE, 2011 - journals.plos.org
How quickly do listeners recognize emotions from a speaker's voice, and does the time
course for recognition vary by emotion type? To address these questions, we adapted the …

Neural mechanisms for voice recognition

A Andics, JM McQueen, KM Petersson, V Gál, G Rudas… - Neuroimage, 2010 - Elsevier
We investigated neural mechanisms that support voice recognition in a training paradigm
with fMRI. The same listeners were trained on different weeks to categorize the mid-regions …

Emotional brain–computer interfaces

G Garcia-Molina, T Tsoneva… - International journal of …, 2013 - inderscienceonline.com
Research in brain–computer interfaces (BCI) has significantly increased during the last few
years. In addition to their initial role as assistive devices for the physically challenged, BCIs …

Acoustic processing of temporally modulated sounds in infants: evidence from a combined near-infrared spectroscopy and EEG study

S Telkemeyer, S Rossi, T Nierhaus… - Frontiers in …, 2011 - frontiersin.org
Speech perception requires rapid extraction of the linguistic content from the acoustic signal.
The ability to efficiently process rapid changes in auditory information is important for …

Embodied listening and timbre: Perceptual, acoustical, and neural correlates

Z Wallmark, M Iacoboni, C Deblieck… - Music Perception: An …, 2018 - online.ucpress.edu
Timbre plays an essential role in transmitting musical affect, and in recent years, our
understanding of emotional expression in music has been enriched by contributions from …

On how the brain decodes vocal cues about speaker confidence

X Jiang, MD Pell - Cortex, 2015 - Elsevier
In speech communication, listeners must accurately decode vocal cues that refer to the
speaker's mental state, such as their confidence or 'feeling of knowing'. However, the time …

Standing sentinel during human sleep: continued evaluation of environmental stimuli in the absence of consciousness

C Blume, R Del Giudice, M Wislowska, DPJ Heib… - Neuroimage, 2018 - Elsevier
While it is a well-established finding that subjects' own names (SON) and familiar voices are
salient during wakefulness, we here investigated processing of environmental stimuli during …

The time course of emotion recognition in speech and music

H Nordström, P Laukka - The Journal of the Acoustical Society of …, 2019 - pubs.aip.org
The auditory gating paradigm was adopted to study how much acoustic information is
needed to recognize emotions from speech prosody and music performances. In Study 1 …

Seeing emotion with your ears: emotional prosody implicitly guides visual attention to faces

S Rigoulot, MD Pell - PLoS ONE, 2012 - journals.plos.org
Interpersonal communication involves the processing of multimodal emotional cues,
particularly facial expressions (visual modality) and emotional speech prosody (auditory …