Rule-based adversarial sample generation for text classification

N Zhou, N Yao, J Zhao, Y Zhang - Neural Computing and Applications, 2022 - Springer
Abstract In text classification, modern neural networks have achieved great performance,
but they are simultaneously sensitive to adversarial examples. Existing studies usually use …

Deep representation learning: Fundamentals, technologies, applications, and open challenges

A Payandeh, KT Baghaei, P Fayyazsanavi… - IEEE …, 2023 - ieeexplore.ieee.org
Machine learning algorithms have had a profound impact on the field of computer science
over the past few decades. The performance of these algorithms heavily depends on the …

Bstt: A bayesian spatial-temporal transformer for sleep staging

Y Liu, Z Jia - The Eleventh International Conference on Learning …, 2023 - openreview.net
Sleep staging is helpful in assessing sleep quality and diagnosing sleep disorders.
However, how to adequately capture the temporal and spatial relations of the brain during …

CareSleepNet: a hybrid deep learning network for automatic sleep staging

J Wang, S Zhao, H Jiang, Y Zhou, Z Yu… - IEEE Journal of …, 2024 - ieeexplore.ieee.org
Sleep staging is essential for sleep assessment and plays an important role in disease
diagnosis, which refers to the classification of sleep epochs into different sleep stages …

Towards exploring the limitations of active learning: An empirical study

Q Hu, Y Guo, M Cordy, X **e, W Ma… - 2021 36th IEEE/ACM …, 2021 - ieeexplore.ieee.org
Deep neural networks (DNNs) are increasingly deployed as integral parts of software
systems. However, due to the complex interconnections among hidden layers and massive …

Neuralmind-unicamp at 2022 trec neuclir: Large boring rerankers for cross-lingual retrieval

V Jeronymo, R Lotufo, R Nogueira - arXiv preprint arXiv:2303.16145, 2023 - arxiv.org
This paper reports on a study of cross-lingual information retrieval (CLIR) using the
mT5-XXL reranker on the NeuCLIR track of TREC 2022. Perhaps the biggest contribution of this …

PL-Transformer: a POS-aware and layer ensemble transformer for text classification

Y Shi, X Zhang, N Yu - Neural Computing and Applications, 2023 - Springer
The transformer-based models have become the de-facto standard for natural language
processing (NLP) tasks. However, most of these models are only designed to capture the …

Local-global coordination with transformers for referring image segmentation

F Liu, Y Kong, L Zhang, G Feng, B Yin - Neurocomputing, 2023 - Elsevier
Referring image segmentation has sprung up benefiting from the outstanding performance
of deep neural networks. However, most existing methods explore either local details or the …

Transferable post-hoc calibration on pretrained transformers in noisy text classification

J Zhang, W Yao, X Chen, L Feng - … of the AAAI Conference on Artificial …, 2023 - ojs.aaai.org
Recent work has demonstrated that pretrained transformers are overconfident in text
classification tasks, which can be calibrated by the famous post-hoc calibration method …

Deep representation learning: Fundamentals, perspectives, applications, and open challenges

KT Baghaei, A Payandeh, P Fayyazsanavi… - arXiv preprint arXiv …, 2022 - arxiv.org
Machine learning algorithms have had a profound impact on the field of computer science
over the past few decades. These algorithms' performance is greatly influenced by the …