Masked language modeling and the distributional hypothesis: Order word matters pre-training for little
A possible explanation for the impressive performance of masked language model (MLM)
pre-training is that such models have learned to represent the syntactic structures prevalent …
Recurrent convolutional neural networks for discourse compositionality
The compositionality of meaning extends beyond the single sentence. Just as words
combine to form the meaning of sentences, so do sentences combine to form the meaning of …
The role of syntax in vector space models of compositional semantics
Modelling the compositional process by which the meaning of an utterance arises from the
meaning of its parts is a fundamental task of Natural Language Processing. In this paper we …
Exploiting deep learning for Persian sentiment analysis
The rise of social media is enabling people to freely express their opinions about products
and services. The aim of sentiment analysis is to automatically determine a subject's sentiment …
" Not not bad" is not" bad": A distributional account of negation
With the increasing empirical success of distributional models of compositional semantics, it
is timely to consider the types of textual logic that such models are capable of capturing. In …
Distributed representations for compositional semantics
KM Hermann - arXiv preprint arXiv:1411.3146, 2014 - arxiv.org
The mathematical representation of semantics is a key issue for Natural Language
Processing (NLP). A lot of research has been devoted to finding ways of representing the …
Supervised and semi-supervised statistical models for word-based sentiment analysis
C Scheible - 2014 - elib.uni-stuttgart.de
Ever since its inception, sentiment analysis has relied heavily on methods that use words as
their basic unit. Even today, such methods deliver top performance. This way of representing …
Transduction Recursive Auto-Associative Memory: Learning Bilingual Compositional Distributed Vector Representations of Inversion Transduction Grammars
We introduce TRAAM, or Transduction RAAM, a fully bilingual generalization of Pollack's
(1990) monolingual Recursive Auto-Associative Memory neural network model …
Exploring the Limits of Systematicity of Natural Language Understanding Models
K Sinha - 2022 - search.proquest.com
In this thesis, we investigate several approaches to evaluate modern neural language
models through the lens of systematicity, in order to assess their human-level reasoning and …
Encoder-decoder neural networks
N Kalchbrenner - 2017 - ora.ox.ac.uk
This thesis introduces the concept of an encoder-decoder neural network and develops
architectures for the construction of such networks. Encoder-decoder neural networks are …