A comprehensive survey on pretrained foundation models: A history from BERT to ChatGPT
Pretrained Foundation Models (PFMs) are regarded as the foundation for various downstream tasks across different data modalities. A PFM (e.g., BERT, ChatGPT, GPT-4) is …
Machine learning methods for small data challenges in molecular science
Small data are often used in scientific and engineering research due to the presence of various constraints, such as time, cost, ethics, privacy, security, and technical limitations in …
A survey on self-supervised learning: Algorithms, applications, and future trends
Deep supervised learning algorithms typically require a large volume of labeled data to achieve satisfactory performance. However, the process of collecting and labeling such data …
Language is all a graph needs
The emergence of large-scale pre-trained language models has revolutionized various AI research domains. Transformers-based Large Language Models (LLMs) have gradually …
Structure-aware transformer for graph representation learning
The Transformer architecture has gained growing attention in graph representation learning recently, as it naturally overcomes several limitations of graph neural networks (GNNs) by …
Uni-Mol: A universal 3D molecular representation learning framework
Molecular representation learning (MRL) has gained tremendous attention due to its critical role in learning from limited supervised data for applications like drug design. In most MRL …
Graph neural networks: foundations, frontiers and applications
The field of graph neural networks (GNNs) has seen rapid and incredible strides over the recent years. Graph neural networks, also known as deep learning on graphs, graph …
Pure transformers are powerful graph learners
We show that standard Transformers without graph-specific modifications can lead to promising results in graph learning both in theory and practice. Given a graph, we simply …
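The snippet above is cut off before it says how a graph is actually fed to the model, so the following is only a rough, hypothetical sketch of the general idea of using an unmodified Transformer on graphs: every node and every edge becomes a token that carries node-identifier features, and a standard encoder with a pooled readout token produces a graph-level prediction. The class name GraphTokenTransformer, the random node identifiers, and all dimensions below are illustrative assumptions, not the paper's implementation.

    import torch
    import torch.nn as nn

    class GraphTokenTransformer(nn.Module):
        # Hypothetical sketch: each node and each edge is one token for a plain
        # Transformer encoder; graph structure enters only through the
        # node-identifier features attached to every token.
        def __init__(self, node_dim, edge_dim, id_dim=16, d_model=64, nhead=4, layers=2):
            super().__init__()
            self.node_proj = nn.Linear(node_dim + 2 * id_dim, d_model)
            self.edge_proj = nn.Linear(edge_dim + 2 * id_dim, d_model)
            self.type_emb = nn.Embedding(2, d_model)                 # node vs. edge token
            self.graph_token = nn.Parameter(torch.zeros(1, 1, d_model))  # readout token
            enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
            self.encoder = nn.TransformerEncoder(enc_layer, layers)
            self.head = nn.Linear(d_model, 1)
            self.id_dim = id_dim

        def forward(self, x, edge_attr, edge_index):
            # x: (N, node_dim), edge_attr: (E, edge_dim), edge_index: (2, E)
            ids = torch.randn(x.size(0), self.id_dim)                # random node identifiers
            src, dst = edge_index
            node_tok = self.node_proj(torch.cat([x, ids, ids], dim=-1))
            edge_tok = self.edge_proj(torch.cat([edge_attr, ids[src], ids[dst]], dim=-1))
            tokens = torch.cat([node_tok + self.type_emb.weight[0],
                                edge_tok + self.type_emb.weight[1]], dim=0).unsqueeze(0)
            tokens = torch.cat([self.graph_token, tokens], dim=1)
            h = self.encoder(tokens)                                 # vanilla self-attention only
            return self.head(h[:, 0])                                # graph-level prediction

    # Toy usage: a 4-node path graph with random features.
    x = torch.randn(4, 8)
    edge_index = torch.tensor([[0, 1, 2], [1, 2, 3]])
    edge_attr = torch.randn(3, 5)
    model = GraphTokenTransformer(node_dim=8, edge_dim=5)
    print(model(x, edge_attr, edge_index).shape)                     # torch.Size([1, 1])

Because nothing in the encoder is graph-specific, any structural information has to come from the token features; the node identifiers are the only place the edge list is used.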
Grammar prompting for domain-specific language generation with large language models
Large language models (LLMs) can learn to perform a wide range of natural language tasks from just a handful of in-context examples. However, for generating strings from highly …
SimGRACE: A simple framework for graph contrastive learning without data augmentation
Graph contrastive learning (GCL) has emerged as a dominant technique for graph representation learning that maximizes the mutual information between paired graph …
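The snippet stops mid-sentence, so here is only a loose sketch of the contrastive setup the title and first sentence describe: two embeddings of the same graph form a positive pair and are pulled together, other graphs in the batch act as negatives, and, in line with "without data augmentation", the second view comes from a weight-perturbed copy of the encoder rather than from an augmented input. The ToyGraphEncoder, the perturbation scale, the padded batch layout, and the one-directional InfoNCE loss are all simplifying assumptions, not the paper's exact encoder or objective.

    import copy
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ToyGraphEncoder(nn.Module):
        # Placeholder graph encoder: mean-pools node features, then an MLP.
        # A real setup would use a GNN; this keeps the sketch self-contained.
        def __init__(self, in_dim=8, hid=32, out=16):
            super().__init__()
            self.mlp = nn.Sequential(nn.Linear(in_dim, hid), nn.ReLU(), nn.Linear(hid, out))

        def forward(self, node_feats):                   # node_feats: (B, N, in_dim)
            return self.mlp(node_feats.mean(dim=1))      # one embedding per graph, (B, out)

    def perturbed_copy(encoder, scale=0.1):
        # The second "view" comes from perturbing the encoder's weights,
        # so no graph augmentation is needed.
        twin = copy.deepcopy(encoder)
        with torch.no_grad():
            for p in twin.parameters():
                p.add_(scale * p.std() * torch.randn_like(p))
        return twin

    def nt_xent(z1, z2, tau=0.5):
        # Simplified InfoNCE: each graph's two embeddings are the positive pair,
        # every other graph in the batch is a negative.
        z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
        sim = z1 @ z2.t() / tau                          # (B, B) similarity matrix
        return F.cross_entropy(sim, torch.arange(z1.size(0)))

    # Toy usage: a batch of 4 graphs, each padded to 10 nodes with 8-dim features.
    batch = torch.randn(4, 10, 8)
    enc = ToyGraphEncoder()
    z1, z2 = enc(batch), perturbed_copy(enc)(batch)
    loss = nt_xent(z1, z2)
    loss.backward()                                      # gradients reach enc through z1
    print(loss.item())

Pulling z1 and z2 together while repelling the other graphs in the batch is one concrete way to realize the mutual-information-style objective the snippet refers to.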