Neural machine translation: A review

F Stahlberg - Journal of Artificial Intelligence Research, 2020 - jair.org
The field of machine translation (MT), the automatic translation of written text from one
natural language into another, has experienced a major paradigm shift in recent years …

Conventional and contemporary approaches used in text to speech synthesis: A review

N Kaur, P Singh - Artificial Intelligence Review, 2023 - Springer
Nowadays speech synthesis, or text to speech (TTS), the ability of a system to produce a
human-like, natural-sounding voice from written text, is gaining popularity in the field of speech …

Towards efficient generative large language model serving: A survey from algorithms to systems

X Miao, G Oliaro, Z Zhang, X Cheng, H Jin… - arXiv preprint arXiv …, 2023 - arxiv.org
In the rapidly evolving landscape of artificial intelligence (AI), generative large language
models (LLMs) stand at the forefront, revolutionizing how we interact with our data. However …

FastSpeech: Fast, robust and controllable text to speech

Y Ren, Y Ruan, X Tan, T Qin, S Zhao… - Advances in neural …, 2019 - proceedings.neurips.cc
Neural network-based end-to-end text to speech (TTS) has significantly improved the quality
of synthesized speech. Prominent methods (e.g., Tacotron 2) usually first generate mel …
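
As context for the parallel mel generation this line of work enables, here is a minimal numpy sketch of a FastSpeech-style length regulator; the durations are hard-coded toy values (the actual model predicts them with a trained duration predictor), so this is an assumed illustration rather than the paper's code.

```python
# Toy length regulator: repeat each phoneme hidden state for its predicted number
# of mel frames, so a non-autoregressive decoder can emit all frames in parallel.
import numpy as np

def length_regulate(phoneme_states: np.ndarray, durations: np.ndarray) -> np.ndarray:
    """Expand (num_phonemes, hidden) states to (num_frames, hidden) by repetition."""
    return np.repeat(phoneme_states, durations, axis=0)

# 3 phonemes with hidden size 4, lasting 2, 3, and 1 frames respectively (toy values)
states = np.random.randn(3, 4)
frames = length_regulate(states, np.array([2, 3, 1]))
print(frames.shape)  # (6, 4): frame-level inputs for parallel mel-spectrogram prediction
```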

Pay less attention with lightweight and dynamic convolutions

F Wu, A Fan, A Baevski, YN Dauphin, M Auli - arXiv preprint arXiv …, 2019 - arxiv.org
Self-attention is a useful mechanism to build generative models for language and images. It
determines the importance of context elements by comparing each element to the current …
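
To make that contrast concrete, the numpy sketch below (an assumed simplification, not code from the paper) compares self-attention weights, which come from comparing every pair of positions, with a lightweight-convolution-style kernel that is softmax-normalized over a fixed window; the paper's dynamic variant additionally predicts the kernel from the current time step.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

T, d, k = 6, 8, 3                        # sequence length, model dim, kernel width
X = np.random.randn(T, d)

# Self-attention: content-dependent weights from all pairwise comparisons.
attn_weights = softmax(X @ X.T / np.sqrt(d))      # (T, T)
attn_out = attn_weights @ X

# Lightweight convolution: one softmax-normalized kernel reused at every position,
# so each output mixes only a fixed local (causal) window, with no pairwise scores.
kernel = softmax(np.random.randn(k))              # (k,)
padded = np.pad(X, ((k - 1, 0), (0, 0)))          # causal left-padding
conv_out = np.stack([kernel @ padded[t:t + k] for t in range(T)])

print(attn_out.shape, conv_out.shape)             # both (6, 8)
```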

A survey on non-autoregressive generation for neural machine translation and beyond

Y Xiao, L Wu, J Guo, J Li, M Zhang… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Non-autoregressive (NAR) generation, which was first proposed in neural machine translation
(NMT) to speed up inference, has attracted much attention in both machine learning and …
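
A hedged, toy-level sketch of the distinction the survey is organized around: an autoregressive decoder makes one model call per output token, while a NAR decoder fills every target position in a single parallel call. The `ar_step`/`nar_step` callables below are stand-ins introduced for illustration, not systems from the survey.

```python
from typing import Callable, List

def decode_ar(ar_step: Callable[[str, List[str]], str], src: str, max_len: int) -> List[str]:
    out: List[str] = []
    for _ in range(max_len):          # O(target length) sequential model calls
        tok = ar_step(src, out)       # conditions on the tokens emitted so far
        if tok == "<eos>":
            break
        out.append(tok)
    return out

def decode_nar(nar_step: Callable[[str, int], List[str]], src: str, tgt_len: int) -> List[str]:
    return nar_step(src, tgt_len)     # one parallel call predicts all positions at once

# toy stand-ins that simply echo the source words
toy_ar = lambda src, prefix: (src.split() + ["<eos>"])[len(prefix)]
toy_nar = lambda src, n: src.split()[:n]
print(decode_ar(toy_ar, "parallel decoding is fast", 10))   # ['parallel', 'decoding', 'is', 'fast']
print(decode_nar(toy_nar, "parallel decoding is fast", 4))  # same output, one call
```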

Glancing transformer for non-autoregressive neural machine translation

L Qian, H Zhou, Y Bao, M Wang, L Qiu… - arXiv preprint arXiv …, 2020 - arxiv.org
Recent work on non-autoregressive neural machine translation (NAT) aims at improving
efficiency through parallel decoding without sacrificing quality. However, existing NAT …
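
For the "glancing" part of the title, here is a hedged sketch of glancing sampling as described by the paper (not its released code): after a first parallel decoding pass, a number of reference tokens proportional to the number of prediction errors is revealed to the decoder, and the model learns to predict the remaining positions in a second parallel pass.

```python
import random

def glancing_sample(reference: list, first_pass: list, ratio: float = 0.5) -> list:
    """Reveal ~ratio * (#errors) reference tokens as decoder hints; mask the rest.
    "<mask>" stands in for whatever the decoder would otherwise see at that position."""
    errors = sum(r != p for r, p in zip(reference, first_pass))
    n_reveal = int(ratio * errors)                     # more errors -> more hints
    revealed = set(random.sample(range(len(reference)), n_reveal))
    return [tok if i in revealed else "<mask>" for i, tok in enumerate(reference)]

ref  = ["we", "fully", "accept", "it"]
pred = ["we", "accept", "reject", "it"]                # two wrong positions -> one hint
print(glancing_sample(ref, pred))                      # e.g. ['<mask>', 'fully', '<mask>', '<mask>']
```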

A study on ReLU and softmax in Transformer

K Shen, J Guo, X Tan, S Tang, R Wang… - arXiv preprint arXiv …, 2023 - arxiv.org
The Transformer architecture consists of self-attention and feed-forward networks (FFNs),
which can be viewed as key-value memories according to previous work. However, FFN …
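
To make the key-value view concrete, here is a small numpy sketch (an assumed illustration, not the paper's code): the rows of the FFN's first weight matrix act as keys scored against the input, the activation turns those scores into mixing weights, and the rows of the second matrix are the values being mixed. Swapping ReLU for softmax at that step is the kind of substitution the paper investigates.

```python
import numpy as np

d_model, d_ff = 8, 32
rng = np.random.default_rng(0)
W_keys = rng.standard_normal((d_ff, d_model))     # one "key" per hidden unit
W_values = rng.standard_normal((d_ff, d_model))   # one "value" per hidden unit
x = rng.standard_normal(d_model)                  # a single token representation

scores = W_keys @ x                               # how strongly x matches each key

relu_weights = np.maximum(scores, 0)              # the usual FFN activation
softmax_weights = np.exp(scores - scores.max())
softmax_weights /= softmax_weights.sum()          # attention-style normalization

ffn_relu_out = relu_weights @ W_values            # weighted sum of value rows
ffn_softmax_out = softmax_weights @ W_values
print(ffn_relu_out.shape, ffn_softmax_out.shape)  # both (8,)
```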

Neural machine translation: A review of methods, resources, and tools

Z Tan, S Wang, Z Yang, G Chen, X Huang, M Sun… - AI Open, 2020 - Elsevier
Machine translation (MT) is an important sub-field of natural language processing
that aims to translate natural languages using computers. In recent years, end-to-end neural …

FastCorrect: Fast error correction with edit alignment for automatic speech recognition

Y Leng, X Tan, L Zhu, J Xu, R Luo… - Advances in …, 2021 - proceedings.neurips.cc
Error correction techniques have been used to refine the output sentences from automatic
speech recognition (ASR) models and achieve a lower word error rate (WER) than original …
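
As a hedged illustration of the edit alignment named in the title (a simplification, not the paper's algorithm), the sketch below aligns an ASR hypothesis with its reference and records how many target tokens each hypothesis token should expand to; knowing these counts up front is what lets a non-autoregressive corrector fix errors in parallel.

```python
from difflib import SequenceMatcher

def edit_alignment(hyp: list, ref: list) -> list:
    """For each hypothesis token: 0 = delete, 1 = keep/substitute, >1 = expand."""
    counts = [0] * len(hyp)
    for op, h0, h1, r0, r1 in SequenceMatcher(a=hyp, b=ref).get_opcodes():
        if op in ("equal", "replace"):
            for i in range(h0, h1):
                counts[i] = 1
            if op == "replace" and (r1 - r0) > (h1 - h0):
                counts[h1 - 1] += (r1 - r0) - (h1 - h0)   # extra reference tokens
        elif op == "insert" and h0 > 0:
            counts[h0 - 1] += r1 - r0                     # attach insertions to the previous token
        # "delete": counts stay 0 for hyp[h0:h1]
    return counts

hyp = ["i", "eat", "a", "apple", "pie"]
ref = ["i", "ate", "an", "apple"]
print(edit_alignment(hyp, ref))   # [1, 1, 1, 1, 0] -> the last hypothesis token should be deleted
```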