Decoding methods in neural language generation: a survey
Neural encoder-decoder models for language generation can be trained to predict words
directly from linguistic or non-linguistic inputs. When generating with these so-called end-to …
If beam search is the answer, what was the question?
Quite surprisingly, exact maximum a posteriori (MAP) decoding of neural language
generators frequently leads to low-quality results. Rather, most state-of-the-art results on …
Learning to reason deductively: Math word problem solving as complex relation extraction
Solving math word problems requires deductive reasoning over the quantities in the text.
Various recent research efforts mostly relied on sequence-to-sequence or sequence-to-tree …
Transformer neural network for protein-specific de novo drug generation as a machine translation problem
D Grechishnikova - Scientific reports, 2021 - nature.com
Drug discovery for a protein target is a very laborious, long and costly process. Machine
learning approaches and, in particular, deep generative networks can substantially reduce …
On decoding strategies for neural text generators
When generating text from probabilistic models, the chosen decoding strategy has a
profound effect on the resulting text. Yet the properties elicited by various decoding …
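The entry above surveys how different decoding strategies shape generated text. One widely used stochastic strategy is nucleus (top-p) sampling; a minimal sketch over a hypothetical toy distribution (the vocabulary and probabilities here are illustrative, not from the cited survey):

```python
import random

def nucleus_sample(probs, p=0.9, rng=random.Random(0)):
    """Sample from the smallest set of highest-probability tokens
    whose cumulative mass reaches p (top-p / nucleus sampling)."""
    # Sort tokens by probability, descending.
    items = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    # Collect the nucleus: the minimal prefix with cumulative mass >= p.
    cum, nucleus = 0.0, []
    for tok, pr in items:
        nucleus.append((tok, pr))
        cum += pr
        if cum >= p:
            break
    # Renormalize within the nucleus and draw a token.
    total = sum(pr for _, pr in nucleus)
    r = rng.random() * total
    for tok, pr in nucleus:
        r -= pr
        if r <= 0:
            return tok
    return nucleus[-1][0]

# With p=0.7 the nucleus is {"a", "b"} (0.5 + 0.3 >= 0.7),
# so "c" and "d" can never be sampled.
print(nucleus_sample({"a": 0.5, "b": 0.3, "c": 0.15, "d": 0.05}, p=0.7))
```

Truncating to the nucleus cuts off the long low-probability tail that pure sampling would otherwise occasionally draw from.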
Is MAP decoding all you need? the inadequacy of the mode in neural machine translation
Recent studies have revealed a number of pathologies of neural machine translation (NMT)
systems. Hypotheses explaining these mostly suggest there is something fundamentally …
Understanding and improving sequence-to-sequence pretraining for neural machine translation
In this paper, we present a substantial step in better understanding the SOTA sequence-to-sequence (Seq2Seq) pretraining for neural machine translation (NMT). We focus on …
Machine translation decoding beyond beam search
Beam search is the go-to method for decoding auto-regressive machine translation models.
While it yields consistent improvements in terms of BLEU, it is only concerned with finding …
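Several of the entries above center on beam search decoding. For reference, a minimal beam search sketch over a hypothetical toy bigram model (the model, vocabulary, and probabilities are illustrative stand-ins for a neural decoder's softmax, not from any cited paper):

```python
import math

# Toy next-token distribution keyed by the previous token.
PROBS = {
    "<s>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "dog": 0.4, "</s>": 0.1},
    "a":   {"cat": 0.3, "dog": 0.3, "</s>": 0.4},
    "cat": {"</s>": 1.0},
    "dog": {"</s>": 1.0},
}

def beam_search(beam_size=2, max_len=5):
    """Return finished hypotheses sorted by log-probability, best first."""
    beams = [(0.0, ["<s>"])]   # (cumulative log-prob, token list)
    finished = []
    for _ in range(max_len):
        candidates = []
        for score, seq in beams:
            for tok, p in PROBS[seq[-1]].items():
                cand = (score + math.log(p), seq + [tok])
                if tok == "</s>":
                    finished.append(cand)
                else:
                    candidates.append(cand)
        if not candidates:
            break
        # Prune: keep only the top-k partial hypotheses.
        beams = sorted(candidates, reverse=True)[:beam_size]
    finished.sort(reverse=True)
    return finished

best_score, best_seq = beam_search()[0]
print(best_seq)  # ['<s>', 'the', 'cat', '</s>']
```

With beam size 2 the search keeps only the two highest-scoring prefixes at each step, so it can miss the true mode when a high-probability continuation hides behind a low-probability prefix, which is the pruning behavior the cited papers probe.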
Language model evaluation beyond perplexity
We propose an alternate approach to quantifying how well language models learn natural
language: we ask how well they match the statistical tendencies of natural language. To …
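The entry above argues for evaluation beyond perplexity; as the baseline it contrasts against, perplexity is just the exponentiated average negative log-likelihood per token. A minimal sketch over assumed per-token log-probabilities (the values are illustrative):

```python
import math

def perplexity(token_log_probs):
    """Perplexity = exp(mean negative log-likelihood over the tokens)."""
    n = len(token_log_probs)
    nll = -sum(token_log_probs) / n
    return math.exp(nll)

# Hypothetical per-token log-probabilities from some language model.
logps = [math.log(0.25), math.log(0.5), math.log(0.125)]
print(perplexity(logps))  # 4.0
```

A single scalar like this says nothing about distribution-level statistics (length, lexical frequencies, repetition), which is the gap the cited work examines.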
Plate: Visually-grounded planning with transformers in procedural tasks
In this work, we study the problem of how to leverage instructional videos to facilitate the
understanding of human decision-making processes, focusing on training a model with the …