Neural encoding and decoding with distributed sentence representations
Building computational models to account for the cortical representation of language plays
an important role in understanding the human linguistic system. Recent progress in …
Learning sentence representation with guidance of human attention
Recently, much progress has been made in learning general-purpose sentence
representations that can be used across domains. However, most of the existing models …
From words to phrases: neural basis of social event semantic composition
H Yang, Y Bi - Brain Structure and Function, 2022 - Springer
Events are typically composed of at least actions and entities. Both actions and entities have
been shown to be represented by neural structures respecting domain organizations in the …
Phrase embedding learning from internal and external information based on autoencoder
R Li, Q Yu, S Huang, L Shen, C Wei, X Sun - Information Processing & …, 2021 - Elsevier
Phrase embedding can improve the performance of multiple NLP tasks. Most of the previous
phrase-embedding methods that only use the external or internal semantic information of …
Memory, show the way: Memory based few shot word representation learning
Distributional semantic models (DSMs) generally require sufficient examples for a word to
learn a high-quality representation. This is in stark contrast with humans, who can guess the …
Exploiting word internal structures for generic chinese sentence representation
We introduce a novel mixed character-word architecture to improve Chinese sentence
representations, by utilizing rich semantic information of word internal structures. Our …
Investigating inner properties of multimodal representation and semantic compositionality with brain-based componential semantics
Multimodal models have been proven to outperform text-based approaches on learning
semantic representations. However, it still remains unclear what properties are encoded in …
Phrase embedding learning based on external and internal context with compositionality constraint
Different methods are proposed to learn phrase embedding, which can be mainly divided
into two strands. The first strand is based on the distributional hypothesis to treat a phrase as …
Self-supervised phrase embedding method by fusing internal and external semantic information of phrases
R Li, C Wei, S Huang, N Yan - Multimedia Tools and Applications, 2023 - Springer
The quality of the phrase embedding is related to the performance of many NLP downstream
tasks. Most of the existing phrase embedding methods struggle to achieve satisfactory …