Neural unsupervised domain adaptation in NLP---a survey
Deep neural networks excel at learning from labeled data and achieve state-of-the-art
results on a wide array of Natural Language Processing tasks. In contrast, learning from …
A survey of syntactic-semantic parsing based on constituent and dependency structures
M Zhang - Science China Technological Sciences, 2020 - Springer
Syntactic and semantic parsing has been investigated for decades, which is one primary
topic in the natural language processing community. This article aims for a brief survey on …
Llm-assisted data augmentation for chinese dialogue-level dependency parsing
Dialogue-level dependency parsing, despite its growing academic interest, often encounters
underperformance issues due to resource shortages. A potential solution to this challenge is …
Rethinking semi-supervised learning with language models
Semi-supervised learning (SSL) is a popular setting aiming to effectively utilize unlabelled
data to improve model performance in downstream natural language processing (NLP) …
NaSGEC: a multi-domain Chinese grammatical error correction dataset from native speaker texts
We introduce NaSGEC, a new dataset to facilitate research on Chinese grammatical error
correction (CGEC) for native speaker texts from multiple domains. Previous CGEC research …
Unsupervised domain adaptation with adapter
Unsupervised domain adaptation (UDA) with pre-trained language models (PrLM) has
achieved promising results since these pre-trained models embed generic knowledge …
Challenges to open-domain constituency parsing
Neural constituency parsers have reached practical performance on news-domain
benchmarks. However, their generalization ability to other domains remains weak. Existing …
Dependency parsing via sequence generation
Dependency parsing aims to extract syntactic dependency structure or semantic
dependency structure for sentences. Existing methods for dependency parsing include …
Semi-supervised reward modeling via iterative self-training
Reward models (RM) capture the values and preferences of humans and play a central role
in Reinforcement Learning with Human Feedback (RLHF) to align pretrained large …
Semi-supervised domain adaptation for dependency parsing via improved contextualized word representations
In recent years, parsing performance has been dramatically improved on in-domain texts thanks to
the rapid progress of deep neural network models. The major challenge for current parsing …