Language model behavior: A comprehensive survey
Transformer language models have received widespread public attention, yet their
generated text is often surprising even to NLP researchers. In this survey, we discuss over …
Analysis methods in neural language processing: A survey
Y Belinkov, J Glass - … of the Association for Computational Linguistics, 2019 - direct.mit.edu
The field of natural language processing has seen impressive progress in recent years, with
neural network models replacing many of the traditional systems. A plethora of new models …
Beyond the imitation game: Quantifying and extrapolating the capabilities of language models
Language models demonstrate both quantitative improvement and new qualitative
capabilities with increasing scale. Despite their potentially transformative impact, these new …
What artificial neural networks can tell us about human language acquisition
A Warstadt, SR Bowman - Algebraic structures in natural …, 2022 - taylorfrancis.com
Rapid progress in machine learning for natural language processing has the potential to
transform debates about how humans learn language. However, the learning environments …
Right for the wrong reasons: Diagnosing syntactic heuristics in natural language inference
A machine learning system can score well on a given test set by relying on heuristics that
are effective for frequent example types but break down in more challenging cases. We …
Syntactic structure from deep learning
Modern deep neural networks achieve impressive performance in engineering applications
that require extensive linguistic skills, such as machine translation. This success has …
Open sesame: Getting inside BERT's linguistic knowledge
How and to what extent does BERT encode syntactically-sensitive hierarchical information
or positionally-sensitive linear information? Recent work has shown that contextual …
What do RNN language models learn about filler-gap dependencies?
RNN language models have achieved state-of-the-art perplexity results and have proven
useful in a suite of NLP tasks, but it is as yet unclear what syntactic generalizations they …
Learning which features matter: RoBERTa acquires a preference for linguistic generalizations (eventually)
One reason pretraining on self-supervised linguistic tasks is effective is that it teaches
models features that are helpful for language understanding. However, we want pretrained …
BERTs of a feather do not generalize together: Large variability in generalization across models with similar test set performance
If the same neural network architecture is trained multiple times on the same dataset, will it
make similar linguistic generalizations across runs? To study this question, we fine-tuned …