Neural machine translation: A review of methods, resources, and tools

Z Tan, S Wang, Z Yang, G Chen, X Huang, M Sun… - AI Open, 2020 - Elsevier
Machine translation (MT) is an important sub-field of natural language processing
that aims to translate natural languages using computers. In recent years, end-to-end neural …

Deep transfer learning & beyond: Transformer language models in information systems research

R Gruetzemacher, D Paradice - ACM Computing Surveys (CSUR), 2022 - dl.acm.org
AI is widely thought to be poised to transform business, yet current perceptions of the scope
of this transformation may be myopic. Recent progress in natural language processing …

From center to surrounding: An interactive learning framework for hyperspectral image classification

J Yang, B Du, L Zhang - ISPRS Journal of Photogrammetry and Remote …, 2023 - Elsevier
Owing to their rich spectral and spatial information, hyperspectral images (HSIs) can be
utilized to finely classify different land covers. With the emergence of deep learning techniques …

Multi-level representation learning with semantic alignment for referring video object segmentation

D Wu, X Dong, L Shao, J Shen - Proceedings of the IEEE …, 2022 - openaccess.thecvf.com
Referring video object segmentation (RVOS) is a challenging language-guided video
grounding task, which requires comprehensively understanding the semantic information of …

Image captioning through image transformer

S He, W Liao, HR Tavakoli, M Yang… - Proceedings of the …, 2020 - openaccess.thecvf.com
Automatic captioning of images is a task that combines the challenges of image analysis
and text generation. One important aspect of captioning is the notion of attention: how to …

Introduction to Transformers: an NLP perspective

T Xiao, J Zhu - arXiv preprint arXiv:2311.17633, 2023 - arxiv.org
Transformers have dominated empirical machine learning models of natural language
processing. In this paper, we introduce basic concepts of Transformers and present key …

Fixed encoder self-attention patterns in transformer-based machine translation

A Raganato, Y Scherrer, J Tiedemann - arXiv preprint arXiv:2002.10260, 2020 - arxiv.org
Transformer-based models have brought a radical change to neural machine translation. A
key feature of the Transformer architecture is the so-called multi-head attention mechanism …

Muformer: A long sequence time-series forecasting model based on modified multi-head attention

P Zeng, G Hu, X Zhou, S Li, P Liu, S Liu - Knowledge-Based Systems, 2022 - Elsevier
Long sequence time-series forecasting (LSTF) problems are widespread in the real world,
such as weather forecasting, stock market forecasting, and power resource management …

TaSbeeb: A judicial decision support system based on deep learning framework

HA Almuzaini, AM Azmi - Journal of King Saud University-Computer and …, 2023 - Elsevier
Since the early 1980s, the legal domain has shown a growing interest in Artificial
Intelligence approaches to tackle the increasing number of cases worldwide. TaSbeeb is a …

Tree-structured attention with hierarchical accumulation

XP Nguyen, S Joty, SCH Hoi, R Socher - arXiv preprint arXiv:2002.08046, 2020 - arxiv.org
Incorporating hierarchical structures like constituency trees has been shown to be effective
for various natural language processing (NLP) tasks. However, it is evident that state-of-the …