Sentiment analysis using deep learning approaches: an overview

O Habimana, Y Li, R Li, X Gu, G Yu - Science China Information Sciences, 2020 - Springer
Nowadays, with the growing number of Web 2.0 tools, users generate huge amounts of
data in a dynamic way. In this regard, sentiment analysis has emerged as …

Survey of neural text representation models

K Babić, S Martinčić-Ipšić, A Meštrović - Information, 2020 - mdpi.com
In natural language processing, text needs to be transformed into a machine-readable
representation before any processing. The quality of further natural language processing …

Ordered neurons: Integrating tree structures into recurrent neural networks

Y Shen, S Tan, A Sordoni, A Courville - arXiv preprint arXiv:1810.09536, 2018 - arxiv.org
Natural language is hierarchically structured: smaller units (e.g., phrases) are nested within
larger units (e.g., clauses). When a larger constituent ends, all of the smaller constituents that …

Deep reinforcement learning

SE Li - Reinforcement learning for sequential decision and …, 2023 - Springer
Similar to humans, RL agents use interactive learning to successfully obtain satisfactory
decision strategies. However, in many cases, it is desirable to learn directly from …

Inducing target-specific latent structures for aspect sentiment classification

C Chen, Z Teng, Y Zhang - … of the 2020 conference on empirical …, 2020 - aclanthology.org
Aspect-level sentiment analysis aims to recognize the sentiment polarity of an aspect or a
target in a comment. Recently, graph convolutional networks based on linguistic …

Graph convolutional encoders for syntax-aware neural machine translation

J Bastings, I Titov, W Aziz, D Marcheggiani… - arXiv preprint arXiv:1704.04675, 2017 - arxiv.org
We present a simple and effective approach to incorporating syntactic structure into neural
attention-based encoder-decoder models for machine translation. We rely on graph …

Tree transformer: Integrating tree structures into self-attention

YS Wang, HY Lee, YN Chen - arXiv preprint arXiv:1909.06639, 2019 - arxiv.org
Pre-training Transformers on large-scale raw text and fine-tuning on the desired task has
achieved state-of-the-art results on diverse NLP tasks. However, it is unclear what the …

A simple, fast diverse decoding algorithm for neural generation

J Li, W Monroe, D Jurafsky - arXiv preprint arXiv:1611.08562, 2016 - arxiv.org
In this paper, we propose a simple, fast decoding algorithm that fosters diversity in neural
generation. The algorithm modifies the standard beam search algorithm by adding an inter …
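The diversity mechanism this snippet describes can be sketched in a few lines. Below is a toy, self-contained illustration (not the paper's implementation): each parent hypothesis ranks its candidate expansions, and a penalty proportional to a candidate's rank among its siblings is subtracted from its score, nudging the beam toward expanding different parents. The function name, the toy hypotheses, and the penalty weight `gamma` are all hypothetical choices for illustration.

```python
def diverse_beam_step(beams, next_logprobs, beam_size, gamma):
    """One beam-search step with a sibling-rank penalty (toy sketch).

    beams: list of (hypothesis_tuple, cumulative_logprob) pairs.
    next_logprobs: dict mapping each hypothesis to {token: logprob}.
    gamma: penalty weight; gamma = 0 recovers standard beam search.
    """
    candidates = []
    for hyp, score in beams:
        # Rank this parent's expansions best-first.
        ranked = sorted(next_logprobs[hyp].items(),
                        key=lambda kv: kv[1], reverse=True)
        for rank, (tok, lp) in enumerate(ranked):
            # Demote lower-ranked siblings so that top candidates
            # from *different* parents can outcompete them.
            candidates.append((hyp + (tok,), score + lp - gamma * rank))
    candidates.sort(key=lambda c: c[1], reverse=True)
    return candidates[:beam_size]

# Toy example: with gamma = 0 both surviving hypotheses come from
# parent "a"; with gamma = 0.5 each parent contributes one.
beams = [(("a",), 0.0), (("b",), -0.1)]
lps = {("a",): {"x": -0.1, "y": -0.2},
       ("b",): {"x": -0.3, "y": -0.9}}
print(diverse_beam_step(beams, lps, beam_size=2, gamma=0.0))
print(diverse_beam_step(beams, lps, beam_size=2, gamma=0.5))
```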

QCD-aware recursive neural networks for jet physics

G Louppe, K Cho, C Becot, K Cranmer - Journal of High Energy Physics, 2019 - Springer
Recent progress in applying machine learning for jet physics has been built upon
an analogy between calorimeters and images. In this work, we present a novel class of …

Learning structured representation for text classification via reinforcement learning

T Zhang, M Huang, L Zhao - Proceedings of the AAAI conference on …, 2018 - ojs.aaai.org
Representation learning is a fundamental problem in natural language processing.
This paper studies how to learn a structured representation for text classification. Unlike …