Deep reinforcement learning: An overview

Y Li - arXiv preprint arXiv:1701.07274, 2017 - arxiv.org
We give an overview of recent exciting achievements of deep reinforcement learning (RL) …

GPT understands, too

X Liu, Y Zheng, Z Du, M Ding, Y Qian, Z Yang, J Tang - AI Open, 2024 - Elsevier
Prompting a pretrained language model with natural language patterns has been proved
effective for natural language understanding (NLU). However, our preliminary study reveals …

Graph neural networks for natural language processing: A survey

L Wu, Y Chen, K Shen, X Guo, H Gao… - … and Trends® in …, 2023 - nowpublishers.com
Deep learning has become the dominant approach in addressing various tasks in Natural
Language Processing (NLP). Although text inputs are typically represented as a sequence …

Natural language processing advancements by deep learning: A survey

A Torfi, RA Shirvani, Y Keneshloo, N Tavaf… - arXiv preprint arXiv …, 2020 - arxiv.org
Natural Language Processing (NLP) helps empower intelligent machines by enhancing a
better understanding of the human language for linguistic-based human-computer …

An empirical evaluation of generic convolutional and recurrent networks for sequence modeling

S Bai, JZ Kolter, V Koltun - arXiv preprint arXiv:1803.01271, 2018 - arxiv.org
For most deep learning practitioners, sequence modeling is synonymous with recurrent
networks. Yet recent results indicate that convolutional architectures can outperform …

Universal language model fine-tuning for text classification

J Howard, S Ruder - arXiv preprint arXiv:1801.06146, 2018 - arxiv.org
Inductive transfer learning has greatly impacted computer vision, but existing approaches in
NLP still require task-specific modifications and training from scratch. We propose Universal …

Simple BERT models for relation extraction and semantic role labeling

P Shi, J Lin - arXiv preprint arXiv:1904.05255, 2019 - arxiv.org
We present simple BERT-based models for relation extraction and semantic role labeling. In
recent years, state-of-the-art performance has been achieved using neural models by …

Semantics-aware BERT for language understanding

Z Zhang, Y Wu, H Zhao, Z Li, S Zhang, X Zhou… - Proceedings of the …, 2020 - ojs.aaai.org
The latest work on language representations carefully integrates contextualized features into
language model training, which enables a series of success especially in various machine …

Detecting formal thought disorder by deep contextualized word representations

J Sarzynska-Wawer, A Wawer, A Pawlak… - Psychiatry …, 2021 - Elsevier
Computational linguistics has enabled the introduction of objective tools that measure some
of the symptoms of schizophrenia, including the coherence of speech associated with formal …