Recent advances in end-to-end automatic speech recognition
J Li - APSIPA Transactions on Signal and Information …, 2022 - nowpublishers.com
Recently, the speech community has been seeing a significant trend of moving from deep neural
network-based hybrid modeling to end-to-end (E2E) modeling for automatic speech …
Speech recognition using deep neural networks: A systematic review
Over the past decades, a tremendous amount of research has been done on the use of
machine learning for speech processing applications, especially speech recognition …
Contrastive representation distillation
Often we wish to transfer representational knowledge from one neural network to another.
Examples include distilling a large network into a smaller one, transferring knowledge from …
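The snippet above describes the standard setting of transferring knowledge from a large teacher network to a smaller student. As a point of reference, the following is a minimal sketch of classic logit-matching distillation with softened outputs, not the contrastive objective this paper actually proposes; it assumes PyTorch, and the temperature, mixing weight, and toy tensor shapes are illustrative choices.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Classic soft-target distillation: KL divergence between the softened
    teacher and student distributions plus the usual hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients keep a comparable magnitude across temperatures
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy usage: in practice the teacher is a frozen, pretrained large network.
teacher_logits = torch.randn(8, 10)
student_logits = torch.randn(8, 10, requires_grad=True)
labels = torch.randint(0, 10, (8,))
distillation_loss(student_logits, teacher_logits, labels).backward()
```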
Split computing and early exiting for deep learning applications: Survey and research challenges
Mobile devices such as smartphones and autonomous vehicles increasingly rely on deep
neural networks (DNNs) to execute complex inference tasks such as image classification …
Sequence-level knowledge distillation
Neural machine translation (NMT) offers a novel alternative formulation of translation that is
potentially simpler than statistical approaches. However, to reach competitive performance …
Emotion recognition in speech using cross-modal transfer in the wild
Obtaining large, human-labelled speech datasets to train models for emotion recognition is a
notoriously challenging task, hindered by annotation cost and label ambiguity. In this work …
Policy distillation
Policies for complex visual tasks have been successfully learned with deep reinforcement
learning, using an approach called deep Q-networks (DQN), but relatively large (task …
Interpretable deep models for ICU outcome prediction
The exponential surge in health care data, such as longitudinal data from electronic health
records (EHR), sensor data from intensive care units (ICUs), etc., is providing new …
Distillation-based training for multi-exit architectures
Multi-exit architectures, in which a stack of processing layers is interleaved with early output
layers, allow the processing of a test example to stop early and thus save computation time …
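To make the mechanism concrete, the sketch below interleaves backbone blocks with early classifier heads: during training every exit produces logits so each head can receive a loss (e.g. a distillation loss against the final exit, in the spirit of this paper's title), and at inference the first exit whose softmax confidence clears a threshold answers. The layer sizes, number of exits, and confidence threshold are illustrative assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiExitMLP(nn.Module):
    """A stack of processing blocks interleaved with early output (exit) heads."""
    def __init__(self, in_dim=64, hidden=128, num_classes=10, num_blocks=3):
        super().__init__()
        self.blocks = nn.ModuleList(
            [nn.Sequential(nn.Linear(in_dim if i == 0 else hidden, hidden), nn.ReLU())
             for i in range(num_blocks)]
        )
        self.exits = nn.ModuleList(
            [nn.Linear(hidden, num_classes) for _ in range(num_blocks)]
        )

    def forward(self, x):
        # Training mode: return logits from every exit so all heads get supervised.
        logits = []
        for block, head in zip(self.blocks, self.exits):
            x = block(x)
            logits.append(head(x))
        return logits

    @torch.no_grad()
    def predict_early(self, x, threshold=0.9):
        # Inference: stop at the first exit whose max class probability clears the threshold.
        for block, head in zip(self.blocks, self.exits):
            x = block(x)
            probs = F.softmax(head(x), dim=-1)
            conf, pred = probs.max(dim=-1)
            if conf.item() >= threshold:
                return pred
        return pred  # no exit was confident enough; fall back to the last one

model = MultiExitMLP()
print(model.predict_early(torch.randn(1, 64)))  # single example, batch size 1
```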
Deep learning based regression for optically inactive inland water quality parameter estimation using airborne hyperspectral imagery
Airborne hyperspectral remote sensing has the characteristics of high spatial and spectral
resolutions, and provides an opportunity for accurate and efficient inland water quality …