A comprehensive survey on pretrained foundation models: A history from bert to chatgpt
Abstract Pretrained Foundation Models (PFMs) are regarded as the foundation for various downstream tasks across different data modalities. A PFM (e.g., BERT, ChatGPT, GPT-4) is …
A comprehensive survey on deep graph representation learning
Graph representation learning aims to effectively encode high-dimensional sparse graph-
structured data into low-dimensional dense vectors, which is a fundamental task that has …
Impact of code language models on automated program repair
Automated program repair (APR) aims to help developers improve software reliability by
generating patches for buggy programs. Although many code language models (CLM) are …
Talk like a graph: Encoding graphs for large language models
Graphs are a powerful tool for representing and analyzing complex relationships in real-
world applications such as social networks, recommender systems, and computational …
SGFormer: Simplifying and empowering transformers for large-graph representations
Learning representations on large-sized graphs is a long-standing challenge due to the interdependence inherent in massive data points. Transformers, as an emerging class of …
Exphormer: Sparse transformers for graphs
Graph transformers have emerged as a promising architecture for a variety of graph learning
and representation tasks. Despite their successes, though, it remains challenging to scale …
Structure-aware transformer for graph representation learning
The Transformer architecture has gained growing attention in graph representation learning
recently, as it naturally overcomes several limitations of graph neural networks (GNNs) by …
Do transformers really perform badly for graph representation?
The Transformer architecture has become a dominant choice in many domains, such as
natural language processing and computer vision. Yet, it has not achieved competitive …
Data augmentation for deep graph learning: A survey
Graph neural networks, a powerful deep learning tool to model graph-structured data, have
demonstrated remarkable performance on numerous graph learning tasks. To address the …
Representing long-range context for graph neural networks with global attention
Graph neural networks are powerful architectures for structured datasets. However, current
methods struggle to represent long-range dependencies. Scaling the depth or width of …