MMICL: Empowering vision-language model with multi-modal in-context learning
Starting from the resurgence of deep learning, vision-language models (VLMs) benefiting
from large language models (LLMs) have never been so popular. However, while LLMs can …
Improving event definition following for zero-shot event detection
Existing approaches on zero-shot event detection usually train models on datasets
annotated with known event types, and prompt them with unseen event definitions. These …
Self-distillation with meta learning for knowledge graph completion
In this paper, we propose a self-distillation framework with meta learning (MetaSD) for
knowledge graph completion with dynamic pruning, which aims to learn compressed graph …
A survey on learning with noisy labels in Natural Language Processing: How to train models with label noise
H Zhang, Y Zhang, J Li, J Liu, L Ji - Engineering Applications of Artificial …, 2025 - Elsevier
When applying deep neural network language models to related systems (e.g., question
answering systems, chatbots, and intelligent assistants), many datasets contain different …
Mitigating Language-Level Performance Disparity in mPLMs via Teacher Language Selection and Cross-lingual Self-Distillation
Large-scale multilingual Pretrained Language Models (mPLMs) yield impressive
performance on cross-language tasks, yet significant performance disparities exist across …
Aligning Large Language Models to Follow Instructions and Hallucinate Less via Effective Data Filtering
Training LLMs on data that contains unfamiliar knowledge during the instruction tuning
stage can make LLMs overconfident and encourage hallucinations. To address this …
Improving the Robustness of Distantly-Supervised Named Entity Recognition via Uncertainty-Aware Teacher Learning and Student-Student Collaborative Learning
Distantly-Supervised Named Entity Recognition (DS-NER) is widely used in real-world
scenarios. It can effectively alleviate the burden of annotation by matching entities in existing …