Efficient methods for natural language processing: A survey
Recent work in natural language processing (NLP) has yielded appealing results from
scaling model parameters and training data; however, using only scale to improve …
Accelerating transformer inference for translation via parallel decoding
Autoregressive decoding limits the efficiency of transformers for Machine Translation (MT).
The community proposed specific network architectures and learning-based methods to …
Deep encoder, shallow decoder: Reevaluating non-autoregressive machine translation
Much recent effort has been invested in non-autoregressive neural machine translation,
which appears to be an efficient alternative to state-of-the-art autoregressive machine …
One country, 700+ languages: NLP challenges for underrepresented languages and dialects in Indonesia
NLP research is impeded by a lack of resources and awareness of the challenges presented
by underrepresented languages and dialects. Focusing on the languages spoken in …
Fully non-autoregressive neural machine translation: Tricks of the trade
Fully non-autoregressive neural machine translation (NAT) is proposed to simultaneously
predict tokens with a single forward pass of a neural network, which significantly reduces the …
MulDA: A multilingual data augmentation framework for low-resource cross-lingual NER
Named Entity Recognition (NER) for low-resource languages is both a practical and
challenging research problem. This paper addresses zero-shot transfer for cross-lingual …
Imitation attacks and defenses for black-box machine translation systems
Adversaries may look to steal or attack black-box NLP systems, either for financial gain or to
exploit model errors. One setting of particular interest is machine translation (MT), where …
Losing Heads in the Lottery: Pruning Transformer Attention in Neural Machine Translation
The attention mechanism is the crucial component of the transformer architecture. Recent
research shows that most attention heads are not confident in their decisions and can be …
When attention meets fast recurrence: Training language models with reduced compute
T. Lei - arXiv preprint arXiv:2102.12459, 2021 - arxiv.org
Large language models have become increasingly difficult to train because of the growing
computation time and cost. In this work, we present SRU++, a highly-efficient architecture …
Finetuning pretrained transformers into RNNs
Transformers have outperformed recurrent neural networks (RNNs) in natural language
generation. But this comes with a significant computational cost, as the attention …