A comprehensive survey on pretrained foundation models: A history from BERT to ChatGPT
Abstract: Pretrained Foundation Models (PFMs) are regarded as the foundation for various
downstream tasks across different data modalities. A PFM (e.g., BERT, ChatGPT, GPT-4) is …
A Survey on Self-supervised Learning: Algorithms, Applications, and Future Trends
Deep supervised learning algorithms typically require a large volume of labeled data to
achieve satisfactory performance. However, the process of collecting and labeling such data …
DINOv2: Learning robust visual features without supervision
The recent breakthroughs in natural language processing for model pretraining on large
quantities of data have opened the way for similar foundation models in computer vision …
Emerging properties in self-supervised vision transformers
In this paper, we question if self-supervised learning provides new properties to Vision
Transformer (ViT) that stand out compared to convolutional networks (convnets). Beyond the …
Barlow twins: Self-supervised learning via redundancy reduction
Self-supervised learning (SSL) is rapidly closing the gap with supervised methods on large
computer vision benchmarks. A successful approach to SSL is to learn embeddings which …
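The Barlow Twins entry above learns embeddings by driving the cross-correlation matrix between two augmented views of a batch toward the identity: diagonal terms are pulled to 1 (invariance) and off-diagonal terms to 0 (redundancy reduction). A minimal NumPy sketch of that objective, assuming standardized embeddings and the paper's off-diagonal weight `lam` (the function name and shapes here are illustrative, not the authors' reference code):

```python
import numpy as np

def barlow_twins_loss(z_a, z_b, lam=5e-3):
    """Redundancy-reduction objective over two views of the same batch.

    z_a, z_b: (batch, dim) embeddings of two augmentations of the same images.
    """
    n, _ = z_a.shape
    # Standardize each embedding dimension over the batch.
    z_a = (z_a - z_a.mean(0)) / z_a.std(0)
    z_b = (z_b - z_b.mean(0)) / z_b.std(0)
    # Cross-correlation matrix between the two views.
    c = (z_a.T @ z_b) / n
    # Invariance term: push diagonal entries toward 1.
    on_diag = ((1.0 - np.diag(c)) ** 2).sum()
    # Redundancy-reduction term: push off-diagonal entries toward 0.
    off_diag = (c ** 2).sum() - (np.diag(c) ** 2).sum()
    return on_diag + lam * off_diag
```

With identical views the diagonal of the cross-correlation is exactly 1, so only the (down-weighted) off-diagonal term remains; independent views leave the diagonal near 0 and the loss large.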
A survey on contrastive self-supervised learning
Self-supervised learning has gained popularity because of its ability to avoid the cost of
annotating large-scale datasets. It is capable of adopting self-defined pseudolabels as …
A unifying review of deep and shallow anomaly detection
Deep learning approaches to anomaly detection (AD) have recently improved the state of
the art in detection performance on complex data sets, such as large collections of images or …
Unsupervised learning of visual features by contrasting cluster assignments
Unsupervised image representations have significantly reduced the gap with supervised
pretraining, notably with the recent achievements of contrastive learning methods. These …
Understanding contrastive representation learning through alignment and uniformity on the hypersphere
Contrastive representation learning has been outstandingly successful in practice. In this
work, we identify two key properties related to the contrastive loss: (1) alignment (closeness) …
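The alignment/uniformity entry above characterizes the contrastive loss by two measurable properties: alignment (positive pairs map to nearby features) and uniformity (features spread out over the unit hypersphere). A small NumPy sketch of the two metrics as commonly stated for L2-normalized features, assuming the standard exponents alpha=2 and t=2 (function names here are illustrative):

```python
import numpy as np

def align_loss(x, y, alpha=2):
    """Alignment: mean distance between features of positive pairs (lower is better)."""
    return (np.linalg.norm(x - y, axis=1) ** alpha).mean()

def uniform_loss(x, t=2):
    """Uniformity: log of the mean Gaussian-kernel similarity over all
    distinct pairs; more negative means features are more spread out."""
    sq_dists = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
    iu = np.triu_indices(x.shape[0], k=1)  # distinct pairs only
    return np.log(np.exp(-t * sq_dists[iu]).mean())
```

Identical positive pairs give an alignment loss of exactly 0, while a batch of fully collapsed features gives a uniformity loss of 0 (its worst value); well-spread features drive it negative.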
Vertical federated learning: Concepts, advances, and challenges
Vertical Federated Learning (VFL) is a federated learning setting where multiple parties with
different features about the same set of users jointly train machine learning models without …