An introduction to deep learning in natural language processing: Models, techniques, and tools

I Lauriola, A Lavelli, F Aiolli - Neurocomputing, 2022 - Elsevier
Natural Language Processing (NLP) is a branch of artificial intelligence that
involves the design and implementation of systems and algorithms able to interact through …

Spanish pre-trained bert model and evaluation data

J Cañete, G Chaperon, R Fuentes, JH Ho… - arXiv preprint arXiv …, 2023 - arxiv.org
The Spanish language is one of the top 5 spoken languages in the world. Nevertheless,
finding resources to train or evaluate Spanish language models is not an easy task. In this …
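
The entry above concerns a pre-trained Spanish BERT model and its evaluation data. As an illustration only, the sketch below shows how such a checkpoint could be loaded and used to encode a Spanish sentence with the Hugging Face transformers library; the checkpoint identifier dccuchile/bert-base-spanish-wwm-cased and the choice of library are assumptions on my part, not details stated in the snippet.

```python
# Minimal sketch: load an (assumed) Spanish BERT checkpoint and extract a
# sentence representation. Requires the transformers and torch packages.
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "dccuchile/bert-base-spanish-wwm-cased"  # assumed Hub identifier

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)

# Tokenize a Spanish sentence and run it through the encoder.
inputs = tokenizer("El español es una de las lenguas más habladas del mundo.",
                   return_tensors="pt")
outputs = model(**inputs)

# Take the [CLS] vector as a simple sentence-level representation.
cls_vector = outputs.last_hidden_state[:, 0]   # shape: (1, hidden_size)
print(cls_vector.shape)
```

Using the [CLS] vector is just one common choice; token-level states from last_hidden_state can equally be fed to a downstream tagging or classification head.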

Deep convolutional neural networks with ensemble learning and transfer learning for capacity estimation of lithium-ion batteries

S Shen, M Sadoughi, M Li, Z Wang, C Hu - Applied Energy, 2020 - Elsevier
It is often difficult for a machine learning model trained on a small amount of
charge/discharge cycling data to achieve satisfactory accuracy in the capacity estimation of …

Deep transfer learning for automatic speech recognition: Towards better generalization

H Kheddar, Y Himeur, S Al-Maadeed, A Amira… - Knowledge-Based …, 2023 - Elsevier
Automatic speech recognition (ASR) has recently become an important challenge for
deep learning (DL), as it requires large-scale training datasets and high computational …

Adversarial transfer learning for Chinese named entity recognition with self-attention mechanism

P Cao, Y Chen, K Liu, J Zhao, S Liu - Proceedings of the 2018 …, 2018 - aclanthology.org
Named entity recognition (NER) is an important task in natural language processing,
which requires determining entity boundaries and classifying them into pre-defined categories …

Choosing transfer languages for cross-lingual learning

YH Lin, CY Chen, J Lee, Z Li, Y Zhang, M Xia… - arXiv preprint arXiv …, 2019 - arxiv.org
Cross-lingual transfer, where a high-resource transfer language is used to improve the
accuracy of a low-resource task language, is now an invaluable tool for improving …

A survey on recent advances in sequence labeling from deep learning models

Z He, Z Wang, W Wei, S Feng, X Mao… - arXiv preprint arXiv …, 2020 - arxiv.org
Sequence labeling (SL) is a fundamental research problem encompassing a variety of tasks,
e.g., part-of-speech (POS) tagging, named entity recognition (NER), text chunking, etc. …

Robust multilingual part-of-speech tagging via adversarial training

M Yasunaga, J Kasai, D Radev - arXiv preprint arXiv:1711.04903, 2017 - arxiv.org
Adversarial training (AT) is a powerful regularization method for neural networks, aiming to
achieve robustness to input perturbations. Yet, the specific effects of the robustness obtained …
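
The snippet above describes adversarial training as a regularization method aiming at robustness to input perturbations. As an illustration only, the sketch below applies an FGM-style perturbation to word embeddings of a toy PyTorch tagger; the model, dimensions, and epsilon value are invented for the example and are not the paper's actual setup.

```python
import torch
import torch.nn as nn

class TinyTagger(nn.Module):
    """Toy BiLSTM tagger operating directly on pre-computed word embeddings."""
    def __init__(self, emb_dim=32, hidden=64, n_tags=12):
        super().__init__()
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, n_tags)

    def forward(self, emb):
        h, _ = self.lstm(emb)
        return self.out(h)          # (batch, seq_len, n_tags)

def adversarial_training_step(model, emb, tags, optimizer, epsilon=0.1):
    """One FGM-style step: perturb embeddings along the loss gradient,
    then minimize the clean loss plus the loss on the perturbed input."""
    criterion = nn.CrossEntropyLoss()
    emb = emb.clone().requires_grad_(True)

    # 1) clean forward pass; keep the graph so the clean loss can be reused below
    clean_loss = criterion(model(emb).flatten(0, 1), tags.flatten())
    grad = torch.autograd.grad(clean_loss, emb, retain_graph=True)[0]

    # 2) worst-case perturbation along the gradient direction, L2-normalized
    delta = epsilon * grad / (grad.norm() + 1e-12)
    adv_loss = criterion(model(emb + delta.detach()).flatten(0, 1), tags.flatten())

    # 3) optimize the sum of clean and adversarial losses
    optimizer.zero_grad()
    (clean_loss + adv_loss).backward()
    optimizer.step()
    return clean_loss.item(), adv_loss.item()

# Toy usage: 4 "sentences" of 9 tokens with 32-dim embeddings and 12 tags.
model = TinyTagger()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
emb = torch.randn(4, 9, 32)
tags = torch.randint(0, 12, (4, 9))
print(adversarial_training_step(model, emb, tags, opt))
```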

Dual adversarial neural transfer for low-resource named entity recognition

JT Zhou, H Zhang, D Jin, H Zhu, M Fang… - Proceedings of the …, 2019 - aclanthology.org
We propose a new neural transfer method termed Dual Adversarial Transfer Network
(DATNet) for addressing low-resource Named Entity Recognition (NER). Specifically, two …

On difficulties of cross-lingual transfer with order differences: A case study on dependency parsing

WU Ahmad, Z Zhang, X Ma, E Hovy, KW Chang… - arXiv preprint arXiv …, 2018 - arxiv.org
Different languages might have different word orders. In this paper, we investigate cross-
lingual transfer and posit that an order-agnostic model will perform better when transferring …