Understanding deep learning techniques for recognition of human emotions using facial expressions: A comprehensive survey

M Karnati, A Seal, D Bhattacharjee… - IEEE Transactions …, 2023 - ieeexplore.ieee.org
Emotion recognition plays a significant role in cognitive psychology research. However,
measuring emotions is a challenging task. Thus, several approaches have been designed …

An overview of deep semi-supervised learning

Y Ouali, C Hudelot, M Tami - arXiv preprint arXiv:2006.05278, 2020 - arxiv.org

Deep neural networks demonstrated their ability to provide remarkable performances on a
wide range of supervised learning tasks (e.g., image classification) when trained on extensive …

DISC: Learning from noisy labels via dynamic instance-specific selection and correction

Y Li, H Han, S Shan, X Chen - Proceedings of the IEEE/CVF …, 2023 - openaccess.thecvf.com
Existing studies indicate that deep neural networks (DNNs) can eventually memorize the
label noise. We observe that the memorization strength of DNNs towards each instance is …

Early-learning regularization prevents memorization of noisy labels

S Liu, J Niles-Weed, N Razavian… - Advances in neural …, 2020 - proceedings.neurips.cc
We propose a novel framework to perform classification via deep learning in the presence of
noisy annotations. When trained on noisy labels, deep neural networks have been observed …

Suppressing uncertainties for large-scale facial expression recognition

K Wang, X Peng, J Yang, S Lu… - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com
Annotating a qualitative large-scale facial expression dataset is extremely difficult due to the
uncertainties caused by ambiguous facial expressions, low-quality facial images, and the …

Normalized loss functions for deep learning with noisy labels

X Ma, H Huang, Y Wang, S Romano… - International …, 2020 - proceedings.mlr.press
Robust loss functions are essential for training accurate deep neural networks (DNNs) in the
presence of noisy (incorrect) labels. It has been shown that the commonly used Cross …
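
The normalization idea named in the entry above has a compact closed form, so a short sketch may help: the usual cross entropy is divided by the sum of cross entropies computed against every candidate class, which keeps the loss bounded even when the given label is wrong. The NumPy snippet below is a minimal illustration under that formulation; the function names and toy inputs are illustrative, not the authors' code, and the paper's full framework also pairs such "active" losses with a passive term (e.g., MAE), which this omits.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def normalized_cross_entropy(logits, labels, eps=1e-12):
    """Cross entropy divided by its sum over all classes, so the loss stays
    bounded no matter how wrong a (possibly noisy) label is."""
    p = np.clip(softmax(logits), eps, 1.0)
    log_p = np.log(p)                               # (N, K)
    ce = -log_p[np.arange(len(labels)), labels]     # CE against the given label
    denom = -log_p.sum(axis=1)                      # sum of CE over every class
    return (ce / denom).mean()

# Toy check: a confident-but-wrong prediction still yields a bounded loss value.
logits = np.array([[4.0, 0.0, 0.0], [0.0, 4.0, 0.0]])
print(normalized_cross_entropy(logits, np.array([0, 0])))
```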

Self-training with Noisy Student improves ImageNet classification

Q Xie, MT Luong, E Hovy… - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com

We present a simple self-training method that achieves 88.4% top-1 accuracy on ImageNet,
which is 2.0% better than the state-of-the-art model that requires 3.5B weakly labeled …
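
As a rough illustration of the teacher–student self-training loop behind the entry above, here is a minimal scikit-learn sketch on synthetic data: a teacher trained on labeled data pseudo-labels an unlabeled pool, and a student is retrained on the union. The 0.9 confidence threshold and logistic-regression models are placeholder choices, not the paper's EfficientNet setup, and the paper additionally "noises" the student and iterates the loop.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Toy data standing in for a small labeled set plus a large unlabeled pool.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_lab, y_lab, X_unlab = X[:200], y[:200], X[200:]

# Teacher: trained on labeled data only.
teacher = LogisticRegression(max_iter=1000).fit(X_lab, y_lab)

# Pseudo-label the unlabeled pool; keep only confident predictions.
probs = teacher.predict_proba(X_unlab)
conf_mask = probs.max(axis=1) > 0.9
pseudo_y = probs.argmax(axis=1)[conf_mask]

# Student: trained on labeled + confidently pseudo-labeled data.
# (In the paper the student is also noised with dropout, stochastic depth,
# and strong augmentation, and the teacher/student cycle is repeated.)
X_student = np.vstack([X_lab, X_unlab[conf_mask]])
y_student = np.concatenate([y_lab, pseudo_y])
student = LogisticRegression(max_iter=1000).fit(X_student, y_student)
print("pseudo-labeled examples used:", conf_mask.sum())
```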

DivideMix: Learning with noisy labels as semi-supervised learning

J Li, R Socher, SCH Hoi - arXiv preprint arXiv:2002.07394, 2020 - arxiv.org
Deep neural networks are known to be annotation-hungry. Numerous efforts have been
devoted to reducing the annotation cost when learning with deep networks. Two prominent …
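
The "noisy labels as semi-supervised learning" framing above rests on first separating probably-clean from probably-noisy samples by their per-sample training loss. The sketch below shows only that loss-based Gaussian-mixture split (the function name, threshold, and toy losses are illustrative); in DivideMix this split feeds two co-trained networks and a MixMatch-style semi-supervised step, which the sketch omits.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def split_clean_noisy(per_sample_loss, threshold=0.5):
    """Fit a two-component 1-D Gaussian mixture to per-sample training losses and
    treat samples assigned to the low-loss component as (probably) clean."""
    losses = np.asarray(per_sample_loss).reshape(-1, 1)
    gmm = GaussianMixture(n_components=2, random_state=0).fit(losses)
    clean_component = np.argmin(gmm.means_.ravel())      # low mean = low loss = clean
    p_clean = gmm.predict_proba(losses)[:, clean_component]
    return p_clean > threshold, p_clean

# Toy losses: most samples fit well (low loss), a minority are mislabeled (high loss).
rng = np.random.default_rng(0)
losses = np.concatenate([rng.normal(0.3, 0.1, 800), rng.normal(2.0, 0.4, 200)])
is_clean, p_clean = split_clean_noisy(losses)
print("flagged clean:", is_clean.sum(), "of", len(losses))
```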

Symmetric cross entropy for robust learning with noisy labels

Y Wang, X Ma, Z Chen, Y Luo, J Yi… - Proceedings of the …, 2019 - openaccess.thecvf.com
Training accurate deep neural networks (DNNs) in the presence of noisy labels is an
important and challenging task. Though a number of approaches have been proposed for …
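
The loss named in the entry above also has a compact form: symmetric cross entropy adds a "reverse" cross-entropy term, in which the prediction and the one-hot label swap roles and log 0 is clipped to a finite negative constant, to the standard cross entropy. The NumPy version below is a minimal sketch assuming that formulation; the weights alpha, beta and the clip value A = -4 are illustrative rather than tuned settings.

```python
import numpy as np

def softmax(logits):
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def symmetric_cross_entropy(logits, labels, alpha=0.1, beta=1.0, log_clip=-4.0):
    """alpha * CE + beta * reverse CE, where log(0) in the reverse term is
    clipped to a finite constant (A = -4 here) so the loss stays bounded."""
    n, k = logits.shape
    p = softmax(logits)                                   # predictions
    onehot = np.eye(k)[labels]                            # one-hot labels
    ce = -np.log(np.clip(p[np.arange(n), labels], 1e-12, 1.0)).mean()
    # Reverse CE: swap prediction and label; log of the one-hot label uses the clip.
    log_onehot = np.where(onehot > 0, 0.0, log_clip)
    rce = -(p * log_onehot).sum(axis=1).mean()
    return alpha * ce + beta * rce

logits = np.array([[2.0, 0.5, -1.0], [0.1, 0.2, 3.0]])
print(symmetric_cross_entropy(logits, np.array([0, 1])))
```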

CancerGPT for few shot drug pair synergy prediction using large pretrained language models

T Li, S Shetty, A Kamath, A Jaiswal, X Jiang… - NPJ Digital …, 2024 - nature.com
Large language models (LLMs) have been shown to have significant potential in few-shot
learning across various fields, even with minimal training data. However, their ability to …