To compress or not to compress—self-supervised learning and information theory: A review
Deep neural networks excel in supervised learning tasks but are constrained by the need for
extensive labeled data. Self-supervised learning emerges as a promising alternative …
Generalization bounds: Perspectives from information theory and PAC-Bayes
A fundamental question in theoretical machine learning is generalization. Over the past
decades, the PAC-Bayesian approach has been established as a flexible framework to …
Towards tracing trustworthiness dynamics: Revisiting pre-training period of large language models
Ensuring the trustworthiness of large language models (LLMs) is crucial. Most studies
concentrate on fully pre-trained LLMs to better understand and improve LLMs' …
Examining and combating spurious features under distribution shift
A central goal of machine learning is to learn robust representations that capture the
fundamental relationship between inputs and output labels. However, minimizing training …
A measure of the complexity of neural representations based on partial information decomposition
In neural networks, task-relevant information is represented jointly by groups of neurons.
However, the specific way in which this mutual information about the classification label is …
Using sliced mutual information to study memorization and generalization in deep neural networks
In this paper, we study the memorization and generalization behaviour of deep neural
networks (DNNs) using sliced mutual information (SMI), which is the average of the mutual …
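For reference, sliced mutual information is commonly defined as the mutual information between one-dimensional random projections of the two variables, averaged over projection directions drawn uniformly from the unit spheres; this is a standard formulation, and the exact conventions used in the paper above may differ:

\mathrm{SMI}(X;Y) \;=\; \mathbb{E}_{\theta \sim \mathrm{Unif}(\mathbb{S}^{d_X-1}),\; \phi \sim \mathrm{Unif}(\mathbb{S}^{d_Y-1})}\!\left[\, I\!\left(\theta^{\top} X;\ \phi^{\top} Y\right) \right]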
Information flow in deep neural networks
Although deep neural networks have been immensely successful, there is no
comprehensive theoretical understanding of how they work or are structured. As a result …
Performance evaluation of deep learning models for image classification over small datasets: Diabetic foot case study
Data scarcity is a common and challenging issue when working with Artificial Intelligence
solutions, especially those including Deep Learning (DL) models for tasks such as image …
Fault detection using generalized autoencoder with neighborhood restriction for electrical drive systems of high-speed trains
Over the past two decades, fault detection of high-speed trains has become an active issue
in the transportation area. Recent work has demonstrated the benefits of autoencoder for …
Minimum description length and generalization guarantees for representation learning
A major challenge in designing efficient statistical supervised learning algorithms is finding
representations that perform well not only on available training samples but also on unseen …