Neural machine translation: A review
F Stahlberg - Journal of Artificial Intelligence Research, 2020 - jair.org
The field of machine translation (MT), the automatic translation of written text from one
natural language into another, has experienced a major paradigm shift in recent years …
Conventional and contemporary approaches used in text to speech synthesis: A review
N Kaur, P Singh - Artificial Intelligence Review, 2023 - Springer
Nowadays speech synthesis, or text to speech (TTS), the ability of a system to produce a
human-like, natural-sounding voice from written text, is gaining popularity in the field of speech …
Towards efficient generative large language model serving: A survey from algorithms to systems
In the rapidly evolving landscape of artificial intelligence (AI), generative large language
models (LLMs) stand at the forefront, revolutionizing how we interact with our data. However …
FastSpeech: Fast, robust and controllable text to speech
Neural network-based end-to-end text to speech (TTS) has significantly improved the quality
of synthesized speech. Prominent methods (e.g., Tacotron 2) usually first generate mel …
Pay less attention with lightweight and dynamic convolutions
Self-attention is a useful mechanism to build generative models for language and images. It
determines the importance of context elements by comparing each element to the current …
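To make the mechanism described in this snippet concrete, here is a minimal NumPy sketch of scaled dot-product self-attention: each element is compared via dot products against every other element, and the resulting softmax weights decide how much each context element contributes to the output. The function name, shapes, and random projections are illustrative assumptions, not taken from the cited paper.

```python
import numpy as np

def self_attention(x, Wq, Wk, Wv):
    """x: (seq_len, d_model); Wq/Wk/Wv: (d_model, d_k) projection matrices."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv            # project tokens into queries/keys/values
    scores = q @ k.T / np.sqrt(k.shape[-1])     # compare every element to every other element
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax: importance of each context element
    return weights @ v                          # weighted sum of value vectors

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 16))                # toy sequence of 5 tokens, d_model = 16
Wq, Wk, Wv = (rng.standard_normal((16, 16)) for _ in range(3))
out = self_attention(x, Wq, Wk, Wv)             # shape (5, 16)
```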
A survey on non-autoregressive generation for neural machine translation and beyond
Non-autoregressive (NAR) generation, which was first proposed in neural machine translation
(NMT) to speed up inference, has attracted much attention in both machine learning and …
Glancing transformer for non-autoregressive neural machine translation
Recent work on non-autoregressive neural machine translation (NAT) aims to improve
efficiency through parallel decoding without sacrificing quality. However, existing NAT …
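Both non-autoregressive entries above hinge on the contrast between step-by-step and parallel decoding. The toy sketch below shows only that control-flow difference; the scoring function is a random stand-in for a real model, and all names are illustrative assumptions rather than anything from the cited papers.

```python
import numpy as np

VOCAB_SIZE = 50

def toy_logits(prefix, position):
    """Stand-in for a model's output distribution at `position` given the decoded `prefix`."""
    seed = (len(prefix) * 131 + position) % (2**32)
    return np.random.default_rng(seed).standard_normal(VOCAB_SIZE)

def autoregressive_decode(length):
    # one model call per position, each conditioned on everything decoded so far
    out = []
    for pos in range(length):
        out.append(int(np.argmax(toy_logits(out, pos))))
    return out

def non_autoregressive_decode(length):
    # every position is predicted independently, so the calls could run in parallel
    return [int(np.argmax(toy_logits([], pos))) for pos in range(length)]

print(autoregressive_decode(6))
print(non_autoregressive_decode(6))
```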
A study on ReLU and softmax in Transformer
The Transformer architecture consists of self-attention and feed-forward networks (FFNs),
which can be viewed as key-value memories according to previous work. However, FFN …
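The "key-value memory" reading of the FFN mentioned in this snippet can be sketched directly: rows of the first projection act as keys matched against the input, the activation gates the matched slots, and rows of the second projection act as the values read out. The sketch below, with ReLU and softmax swappable for comparison, is my own minimal illustration and not the cited paper's implementation.

```python
import numpy as np

def ffn(x, W1, b1, W2, b2, activation="relu"):
    """x: (seq_len, d_model); W1: (d_model, d_ff) "keys"; W2: (d_ff, d_model) "values"."""
    scores = x @ W1 + b1                        # match the input against d_ff key slots
    if activation == "relu":
        gate = np.maximum(scores, 0.0)          # sparse, unnormalized slot activations
    else:                                       # softmax: slots compete for a fixed budget
        e = np.exp(scores - scores.max(axis=-1, keepdims=True))
        gate = e / e.sum(axis=-1, keepdims=True)
    return gate @ W2 + b2                       # read out a weighted mix of value rows

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
W1, b1 = rng.standard_normal((8, 32)), np.zeros(32)
W2, b2 = rng.standard_normal((32, 8)), np.zeros(8)
y_relu = ffn(x, W1, b1, W2, b2, "relu")
y_softmax = ffn(x, W1, b1, W2, b2, "softmax")
```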
Neural machine translation: A review of methods, resources, and tools
Machine translation (MT) is an important sub-field of natural language processing
that aims to translate natural languages using computers. In recent years, end-to-end neural …
FastCorrect: Fast error correction with edit alignment for automatic speech recognition
Error correction techniques have been used to refine the output sentences from automatic
speech recognition (ASR) models and achieve a lower word error rate (WER) than original …
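The word error rate mentioned in this snippet is word-level edit distance (substitutions, insertions, deletions) normalized by the number of reference words. A minimal, self-contained sketch follows; the function name and example strings are my own, not from the paper.

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: Levenshtein distance over words divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = dp[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            dp[i][j] = min(sub, dp[i - 1][j] + 1, dp[i][j - 1] + 1)
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)

print(wer("the cat sat on the mat", "the cat sit on mat"))  # 2 edits / 6 words = 0.333...
```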