Pre-trained language models for text generation: A survey
Text Generation aims to produce plausible and readable text in human language from input data. The resurgence of deep learning has greatly advanced this field, in particular, with the …
AMMUS: A survey of transformer-based pretrained models in natural language processing
KS Kalyan, A Rajasekharan, S Sangeetha - arXiv preprint arXiv …, 2021 - arxiv.org
Transformer-based pretrained language models (T-PTLMs) have achieved great success in almost every NLP task. The evolution of these models started with GPT and BERT. These …
NusaCrowd: Open source initiative for Indonesian NLP resources
We present NusaCrowd, a collaborative initiative to collect and unify existing resources for Indonesian languages, including opening access to previously non-public resources …
A multitask, multilingual, multimodal evaluation of ChatGPT on reasoning, hallucination, and interactivity
This paper proposes a framework for quantitatively evaluating interactive LLMs such as ChatGPT using publicly available data sets. We carry out an extensive technical evaluation …
Aya dataset: An open-access collection for multilingual instruction tuning
Datasets are foundational to many breakthroughs in modern artificial intelligence. Many recent achievements in the space of natural language processing (NLP) can be attributed to …
End-to-end transformer-based models in textual-based NLP
Transformer architectures are highly expressive because they use self-attention mechanisms to encode long-range dependencies in the input sequences. In this paper, we …
Negative object presence evaluation (NOPE) to measure object hallucination in vision-language models
Object hallucination poses a significant challenge in vision-language (VL) models, often leading to the generation of nonsensical or unfaithful responses with non-existent objects …
Language models are few-shot multilingual learners
General-purpose language models have demonstrated impressive capabilities, performing on par with state-of-the-art approaches on a range of downstream natural language …
A systematic review of transformer-based pre-trained language models through self-supervised learning
Transfer learning is a technique utilized in deep learning applications to transmit learned inference to a different target domain. The approach is mainly to solve the problem of a few …
ChatGPT label: Comparing the quality of human-generated and LLM-generated annotations in low-resource language NLP tasks
This research paper presents a comprehensive comparative study assessing the quality of annotations in Turkish, Indonesian, and Minangkabau Natural Language Processing (NLP) …