A survey of textual emotion recognition and its challenges

J Deng, F Ren - IEEE Transactions on Affective Computing, 2021 - ieeexplore.ieee.org
Textual language is the most natural carrier of human emotion. In natural language
processing, textual emotion recognition (TER) has become an important topic due to its …

TextFCG: Fusing contextual information via graph learning for text classification

Y Wang, C Wang, J Zhan, W Ma, Y Jiang - Expert Systems with Applications, 2023 - Elsevier
Text classification is a fundamental task in Natural Language Processing (NLP). Graph
neural networks can better handle the large amount of information in text, and effective and …

Nyströmformer: A Nyström-based algorithm for approximating self-attention

Y Xiong, Z Zeng, R Chakraborty, M Tan… - Proceedings of the …, 2021 - ojs.aaai.org
Transformers have emerged as a powerful tool for a broad range of natural language
processing tasks. A key component that drives the impressive performance of Transformers …
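
The core idea in the Nyströmformer paper is to avoid materializing the full n×n softmax attention map by approximating it from a small set of landmark queries and keys. The NumPy sketch below follows that recipe under simplifying assumptions: landmarks are plain segment means, and an exact pseudo-inverse stands in for the iterative Moore-Penrose approximation the paper uses.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def nystrom_attention(Q, K, V, num_landmarks=8):
    """Nystrom-style approximation of softmax attention.

    The full n x n attention matrix is never formed; landmark queries/keys
    (here simple segment means) give three small softmax matrices whose
    product approximates softmax(QK^T / sqrt(d)) V.
    """
    n, d = Q.shape
    m = num_landmarks
    Q_l = Q.reshape(m, n // m, d).mean(axis=1)   # landmark queries
    K_l = K.reshape(m, n // m, d).mean(axis=1)   # landmark keys
    scale = 1.0 / np.sqrt(d)
    F = softmax(Q @ K_l.T * scale)               # n x m
    A = softmax(Q_l @ K_l.T * scale)             # m x m
    B = softmax(Q_l @ K.T * scale)               # m x n
    # Exact pseudo-inverse for brevity; the paper uses an iterative scheme.
    return F @ np.linalg.pinv(A) @ (B @ V)

# Toy usage: sequence length must be divisible by num_landmarks in this sketch.
rng = np.random.default_rng(0)
n, d = 64, 32
Q, K, V = rng.normal(size=(3, n, d))
full = softmax(Q @ K.T / np.sqrt(d)) @ V
approx = nystrom_attention(Q, K, V, num_landmarks=8)
print(np.abs(full - approx).mean())
```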

[BOOK][B] Pretrained transformers for text ranking: BERT and beyond

J Lin, R Nogueira, A Yates - 2022 - books.google.com
The goal of text ranking is to generate an ordered list of texts retrieved from a corpus in
response to a query. Although the most common formulation of text ranking is search …
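
The most common formulation the book covers is cross-encoder reranking (monoBERT and its relatives), where a pretrained transformer scores each query-passage pair jointly. A minimal sketch with the Hugging Face transformers library follows; the checkpoint name is only an example of a publicly available cross-encoder, not the book's reference implementation.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Example checkpoint; any cross-encoder trained for passage ranking works similarly.
name = "cross-encoder/ms-marco-MiniLM-L-6-v2"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)
model.eval()

query = "what causes rainbows"
candidates = [
    "Rainbows are caused by refraction and dispersion of sunlight in water droplets.",
    "The 2022 world cup was held in Qatar.",
]

# Each (query, passage) pair is encoded and scored jointly by the transformer.
inputs = tokenizer([query] * len(candidates), candidates,
                   padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    scores = model(**inputs).logits.squeeze(-1)

# Rerank candidates by descending relevance score.
for passage, score in sorted(zip(candidates, scores.tolist()), key=lambda x: -x[1]):
    print(f"{score:6.2f}  {passage}")
```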

Quality prediction modeling for industrial processes using multiscale attention-based convolutional neural network

X Yuan, L Huang, L Ye, Y Wang… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Soft sensors have been increasingly applied for quality prediction in complex industrial
processes, which often have different scales of topology and highly coupled spatiotemporal …

“Low-resource” text classification: A parameter-free classification method with compressors

Z Jiang, M Yang, M Tsirlin, R Tang… - Findings of the …, 2023 - aclanthology.org
Deep neural networks (DNNs) are often used for text classification due to their high
accuracy. However, DNNs can be computationally intensive, requiring millions of …
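
The parameter-free method in this paper pairs gzip compression lengths with a k-nearest-neighbour vote under the normalized compression distance (NCD), so no training or parameters are needed. A toy sketch of that recipe, with made-up training texts, might look like this:

```python
import gzip
from collections import Counter

def clen(s: str) -> int:
    """Length of the gzip-compressed byte string."""
    return len(gzip.compress(s.encode("utf-8")))

def ncd(a: str, b: str) -> float:
    """Normalized compression distance between two texts."""
    ca, cb, cab = clen(a), clen(b), clen(a + " " + b)
    return (cab - min(ca, cb)) / max(ca, cb)

def knn_predict(test_text, train_texts, train_labels, k=3):
    """Label a text by majority vote over its k nearest training texts under NCD."""
    dists = sorted(zip((ncd(test_text, t) for t in train_texts), train_labels))
    top = [label for _, label in dists[:k]]
    return Counter(top).most_common(1)[0][0]

train = [("the striker scored twice in the final", "sports"),
         ("the central bank raised interest rates", "finance"),
         ("midfielder signs a new three year contract", "sports"),
         ("stocks fell after the inflation report", "finance")]
texts, labels = zip(*train)
print(knn_predict("goalkeeper saves a penalty in extra time", texts, labels, k=3))
```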

CLEAR: Contrastive learning for sentence representation

Z Wu, S Wang, J Gu, M Khabsa, F Sun, H Ma - arXiv preprint arXiv …, 2020 - arxiv.org
Pre-trained language models have proven their unique powers in capturing implicit
language features. However, most pre-training approaches focus on the word-level training …
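
CLEAR moves contrastive pre-training to the sentence level: two augmented views of the same sentence (e.g. word or span deletion, reordering, substitution) are pulled together while the rest of the batch is pushed apart. A common instantiation of such an objective is the NT-Xent loss sketched below; treat it as an illustrative stand-in rather than the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.1):
    """NT-Xent contrastive loss over two augmented views of the same batch.

    z1[i] and z2[i] are embeddings of two augmentations of sentence i;
    all other embeddings in the 2N batch act as negatives.
    """
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)    # (2N, d), unit norm
    sim = z @ z.t() / temperature                          # cosine similarities
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool)
    sim = sim.masked_fill(mask, float("-inf"))             # drop self-similarity
    # The positive for row i is its counterpart in the other view.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

# Toy usage with random "sentence embeddings" standing in for encoder outputs.
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
print(nt_xent_loss(z1, z2).item())
```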

A term weighted neural language model and stacked bidirectional LSTM based framework for sarcasm identification

A Onan, MA Toçoğlu - IEEE Access, 2021 - ieeexplore.ieee.org
Sarcasm identification on text documents is one of the most challenging tasks in natural
language processing (NLP) and has become an essential research direction due to its …

[HTML] Hierarchical graph-based text classification framework with contextual node embedding and BERT-based dynamic fusion

A Onan - Journal of King Saud University-Computer and …, 2023 - Elsevier
We propose a novel hierarchical graph-based text classification framework that leverages
the power of contextual node embedding and BERT-based dynamic fusion to capture the …

Estimating training data influence by tracing gradient descent

G Pruthi, F Liu, S Kale… - Advances in Neural …, 2020 - proceedings.neurips.cc
We introduce a method called TracIn that computes the influence of a training example on a
prediction made by the model. The idea is to trace how the loss on the test point changes …
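
TracIn estimates the influence of a training example on a test prediction by summing, over saved checkpoints, the learning rate times the dot product of the training-loss gradient and the test-loss gradient at that checkpoint. A compact sketch of that estimator follows; the helper names and the toy linear model are hypothetical.

```python
import torch

def tracin_influence(model_factory, checkpoints, lrs, loss_fn, z_train, z_test):
    """First-order TracIn influence of one training example on one test example.

    Approximates influence as sum over checkpoints t of
    lr_t * grad(loss(z_train); w_t) . grad(loss(z_test); w_t).
    """
    total = 0.0
    for state_dict, lr in zip(checkpoints, lrs):
        model = model_factory()
        model.load_state_dict(state_dict)
        params = [p for p in model.parameters() if p.requires_grad]
        g_train = torch.autograd.grad(loss_fn(model, z_train), params)
        g_test = torch.autograd.grad(loss_fn(model, z_test), params)
        total += lr * sum((a * b).sum() for a, b in zip(g_train, g_test)).item()
    return total

# Toy usage: a linear regression model and squared-error loss.
def make_model():
    return torch.nn.Linear(3, 1)

def sq_loss(model, example):
    x, y = example
    return ((model(x) - y) ** 2).mean()

ckpts = [make_model().state_dict() for _ in range(3)]   # stand-ins for saved checkpoints
z_tr = (torch.randn(3), torch.randn(1))
z_te = (torch.randn(3), torch.randn(1))
print(tracin_influence(make_model, ckpts, [0.1, 0.1, 0.1], sq_loss, z_tr, z_te))
```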