Natural language generation and understanding of big code for AI-assisted programming: A review
MF Wong, S Guo, CN Hang, SW Ho, CW Tan - Entropy, 2023 - mdpi.com
This paper provides a comprehensive review of the literature concerning the utilization of
Natural Language Processing (NLP) techniques, with a particular focus on transformer …
A survey on deep graph generation: Methods and applications
Graphs are ubiquitous in encoding relational information of real-world objects in many
domains. Graph generation, whose purpose is to generate new graphs from a distribution …
Crosslingual generalization through multitask finetuning
Multitask prompted finetuning (MTF) has been shown to help large language models
generalize to new tasks in a zero-shot setting, but so far explorations of MTF have focused …
An empirical evaluation of using large language models for automated unit test generation
Unit tests play a key role in ensuring the correctness of software. However, manually
creating unit tests is a laborious task, motivating the need for automation. Large Language …
OctoPack: Instruction tuning code large language models
Finetuning large language models (LLMs) on instructions leads to vast performance
improvements on natural language tasks. We apply instruction tuning using code …
Aya model: An instruction finetuned open-access multilingual language model
Recent breakthroughs in large language models (LLMs) have centered around a handful of
data-rich languages. What does it take to broaden access to breakthroughs beyond first …
Graph neural networks: foundations, frontiers and applications
The field of graph neural networks (GNNs) has seen rapid and incredible strides over the
recent years. Graph neural networks, also known as deep learning on graphs, graph …
Program synthesis with large language models
This paper explores the limits of the current generation of large language models for
program synthesis in general purpose programming languages. We evaluate a collection of …
Do transformers really perform badly for graph representation?
The Transformer architecture has become a dominant choice in many domains, such as
natural language processing and computer vision. Yet, it has not achieved competitive …
CodeXGLUE: A machine learning benchmark dataset for code understanding and generation
Benchmark datasets have a significant impact on accelerating research in programming
language tasks. In this paper, we introduce CodeXGLUE, a benchmark dataset to foster …