A survey on deep neural network pruning: Taxonomy, comparison, analysis, and recommendations
Modern deep neural networks, particularly recent large language models, come with
massive model sizes that require significant computational and storage resources. To …
Continual learning for recurrent neural networks: an empirical evaluation
Learning continuously throughout a model's lifetime is fundamental to deploying machine learning
solutions that are robust to drifts in the data distribution. Advances in Continual Learning (CL) with …
Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks
The growing energy and performance costs of deep learning have driven the community to
reduce the size of neural networks by selectively pruning components. Similarly to their …
Analyzing multi-head self-attention: Specialized heads do the heavy lifting, the rest can be pruned
Multi-head self-attention is a key component of the Transformer, a state-of-the-art
architecture for neural machine translation. In this work we evaluate the contribution made …
Mind the GAP: A balanced corpus of gendered ambiguous pronouns
Coreference resolution is an important task for natural language understanding, and the
resolution of ambiguous pronouns a longstanding challenge. Nonetheless, existing corpora …
The FLoRes evaluation datasets for low-resource machine translation: Nepali-English and Sinhala-English
For machine translation, a vast majority of language pairs in the world are considered low-resource
because they have little parallel data available. Besides the technical challenges …
Cross-lingual transfer learning for multilingual task oriented dialog
One of the first steps in the utterance interpretation pipeline of many task-oriented
conversational AI systems is to identify user intents and the corresponding slots. Since data …
Context-aware neural machine translation learns anaphora resolution
Standard machine translation systems process sentences in isolation and hence ignore
extra-sentential information, even though extended context can both prevent mistakes in …
When a good translation is wrong in context: Context-aware machine translation improves on deixis, ellipsis, and lexical cohesion
Though machine translation errors caused by the lack of context beyond one sentence have
long been acknowledged, the development of context-aware NMT systems is hampered by …
Self-training improves pre-training for natural language understanding
Unsupervised pre-training has led to much recent progress in natural language
understanding. In this paper, we study self-training as another way to leverage unlabeled …