A primer on neural network models for natural language processing
Y Goldberg - Journal of Artificial Intelligence Research, 2016 - jair.org
Over the past few years, neural networks have re-emerged as powerful machine-learning
models, yielding state-of-the-art results in fields such as image recognition and speech …
Multi-task learning in natural language processing: An overview
Deep learning approaches have achieved great success in the field of Natural Language
Processing (NLP). However, directly training deep neural models often suffers from overfitting …
Multi-task learning with deep neural networks: A survey
M Crawshaw - arXiv preprint arXiv:2009.09796, 2020 - arxiv.org
Multi-task learning (MTL) is a subfield of machine learning in which multiple tasks are
simultaneously learned by a shared model. Such approaches offer advantages like …
UniT: Multimodal multitask learning with a unified transformer
Abstract We propose UniT, a Unified Transformer model to simultaneously learn the most
prominent tasks across different domains, ranging from object detection to natural language …
Modular deep learning
Transfer learning has recently become the dominant paradigm of machine learning. Pre-
trained models fine-tuned for downstream tasks achieve better performance with fewer …
Contextual string embeddings for sequence labeling
Recent advances in language modeling using recurrent neural networks have made it
viable to model language as distributions over characters. By learning to predict the next …
Universal language model fine-tuning for text classification
Inductive transfer learning has greatly impacted computer vision, but existing approaches in
NLP still require task-specific modifications and training from scratch. We propose Universal …
Detecting formal thought disorder by deep contextualized word representations
J Sarzynska-Wawer, A Wawer, A Pawlak… - Psychiatry …, 2021 - Elsevier
Computational linguistics has enabled the introduction of objective tools that measure some
of the symptoms of schizophrenia, including the coherence of speech associated with formal …
GradNorm: Gradient normalization for adaptive loss balancing in deep multitask networks
Deep multitask networks, in which one neural network produces multiple predictive outputs,
can offer better speed and performance than their single-task counterparts but are …
[BOOK][B] Neural network methods in natural language processing
Y Goldberg - 2017 - books.google.com
Neural networks are a family of powerful machine learning models and this book focuses on
their application to natural language data. The first half of the book (Parts I and II) covers the …