A survey of GPT-3 family large language models including ChatGPT and GPT-4
KS Kalyan - Natural Language Processing Journal, 2024 - Elsevier
Large language models (LLMs) are a special class of pretrained language models (PLMs)
obtained by scaling model size, pretraining corpus and computation. LLMs, because of their …
Machine learning methods for small data challenges in molecular science
Small data are often used in scientific and engineering research due to the presence of
various constraints, such as time, cost, ethics, privacy, security, and technical limitations in …
ImageReward: Learning and evaluating human preferences for text-to-image generation
We present a comprehensive solution to learn and improve text-to-image models from
human preference feedback. To begin with, we build ImageReward---the first general …
A survey on deep learning tools dealing with data scarcity: definitions, challenges, solutions, tips, and applications
Data scarcity is a major challenge when training deep learning (DL) models. DL demands a
large amount of data to achieve exceptional performance. Unfortunately, many applications …
A comparison review of transfer learning and self-supervised learning: Definitions, applications, advantages and limitations
Deep learning has emerged as a powerful tool in various domains, revolutionising machine
learning research. However, one persistent challenge is the scarcity of labelled training …
GLM-130B: An open bilingual pre-trained model
We introduce GLM-130B, a bilingual (English and Chinese) pre-trained language model
with 130 billion parameters. It is an attempt to open-source a 100B-scale model at least as …
A survey on deep neural network pruning: Taxonomy, comparison, analysis, and recommendations
Modern deep neural networks, particularly recent large language models, come with
massive model sizes that require significant computational and storage resources. To …
GraphMAE: Self-supervised masked graph autoencoders
Self-supervised learning (SSL) has been extensively explored in recent years. Particularly,
generative SSL has seen emerging success in natural language processing and other …
Machine learning for a sustainable energy future
Transitioning from fossil fuels to renewable energy sources is a critical global challenge; it
demands advances—at the materials, devices and systems levels—for the efficient …
AgentBench: Evaluating LLMs as agents
Large Language Models (LLMs) are becoming increasingly smart and autonomous,
targeting real-world pragmatic missions beyond traditional NLP tasks. As a result, there has …