Revisiting modularity maximization for graph clustering: A contrastive learning perspective
Graph clustering, a fundamental and challenging task in graph mining, aims to classify
nodes in a graph into several disjoint clusters. In recent years, graph contrastive learning …
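The modularity objective named in the title is a standard graph-clustering criterion. As a rough illustration of what is being maximized (Newman's modularity Q for a fixed node partition, not this paper's contrastive reformulation of it), a minimal NumPy sketch:

```python
import numpy as np

def modularity(adj: np.ndarray, communities: np.ndarray) -> float:
    """Newman's modularity Q for an undirected graph.

    adj: symmetric adjacency matrix (n x n)
    communities: integer cluster label per node (length n)
    """
    two_m = adj.sum()                                    # 2m: total edge weight, each edge counted twice
    degrees = adj.sum(axis=1)                            # node degrees k_i
    expected = np.outer(degrees, degrees) / two_m        # k_i * k_j / (2m)
    same = communities[:, None] == communities[None, :]  # delta(c_i, c_j)
    return ((adj - expected) * same).sum() / two_m

# Toy example: two triangles joined by a single edge; the natural 2-way split scores high.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
print(round(modularity(A, np.array([0, 0, 0, 1, 1, 1])), 3))  # ~0.357
```

A contrastive reformulation would optimize node embeddings so that a partition derived from them scores well under an objective of this kind.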
Contrastive tuning: A little help to make masked autoencoders forget
Masked Image Modeling (MIM) methods, like Masked Autoencoders (MAE), efficiently learn
a rich representation of the input. However, for adapting to downstream tasks, they require a …
Memorization in self-supervised learning improves downstream generalization
Self-supervised learning (SSL) has recently received significant attention due to its ability to
train high-performance encoders purely on unlabeled data, often scraped from the internet …
Cluster-aware semi-supervised learning: relational knowledge distillation provably learns clustering
Despite the empirical success and practical significance of (relational) knowledge distillation
that matches (the relations of) features between teacher and student models, the …
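As a rough sketch of the "matching relations of features" idea, here is a generic relational-distillation loss in the spirit of RKD (pairwise cosine relations compared between teacher and student), assuming NumPy; it is not necessarily the exact formulation analyzed in this paper:

```python
import numpy as np

def pairwise_cosine(feats: np.ndarray) -> np.ndarray:
    """Cosine-similarity matrix over all pairs of feature vectors."""
    normed = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    return normed @ normed.T

def relational_kd_loss(teacher_feats: np.ndarray, student_feats: np.ndarray) -> float:
    """Penalize differences between the teacher's and student's pairwise relations,
    rather than matching the feature vectors themselves."""
    diff = pairwise_cosine(teacher_feats) - pairwise_cosine(student_feats)
    return float(np.mean(diff ** 2))

rng = np.random.default_rng(0)
teacher = rng.normal(size=(8, 64))   # 8 samples, teacher embedding dim 64
student = rng.normal(size=(8, 16))   # student may use a different dimension
print(relational_kd_loss(teacher, student))
```

Because only relations are matched, the student's embedding dimension is free to differ from the teacher's, which is part of what makes the clustering-style analysis natural here.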
A multi-view graph contrastive learning framework for deciphering spatially resolved transcriptomics data
Spatially resolved transcriptomics data are being used in a revolutionary way to decipher the
spatial pattern of gene expression and the spatial architecture of cell types. Much work has …
Cross-Domain Contrastive Learning for Time Series Clustering
F Peng, J Luo, X Lu, S Wang, F Li - … of the AAAI Conference on Artificial …, 2024 - ojs.aaai.org
Most deep learning-based time series clustering models concentrate on data representation
in a separate process from clustering. As a result, the clustering loss cannot guide feature …
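To make the separation issue concrete, a minimal, hypothetical sketch of joint training (NumPy, a generic k-means-style cluster term with an assumed weight lam; not this paper's cross-domain method): the clustering term is added to the same objective as the representation loss, so cluster structure can shape the learned features instead of being applied afterwards.

```python
import numpy as np

def kmeans_style_cluster_loss(z: np.ndarray, centroids: np.ndarray) -> float:
    """Mean squared distance of each embedding to its nearest centroid."""
    d2 = ((z[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)  # (n, k)
    return float(d2.min(axis=1).mean())

def joint_objective(z: np.ndarray, centroids: np.ndarray,
                    representation_loss: float, lam: float = 0.1) -> float:
    """One objective for both terms; in a real model, z and the representation
    loss come from the same encoder, so the cluster term can guide its features."""
    return representation_loss + lam * kmeans_style_cluster_loss(z, centroids)

rng = np.random.default_rng(1)
z = rng.normal(size=(32, 8))          # embeddings of 32 time-series windows
centroids = rng.normal(size=(4, 8))   # 4 learnable cluster centers
print(joint_objective(z, centroids, representation_loss=1.25))
```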
Exploiting Representation Curvature for Boundary Detection in Time Series
Boundaries are the timestamps at which a class in a time series changes. Recently,
representation-based boundary detection has gained popularity, but its emphasis on …
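A minimal illustration of exploiting the curvature of a representation trajectory for boundary detection: a generic turning-angle measure on consecutive displacement vectors, flagging sharp turns as candidate boundaries (the paper's exact curvature definition may differ).

```python
import numpy as np

def turning_angles(reps: np.ndarray) -> np.ndarray:
    """Cosine between consecutive displacement vectors of a representation trajectory.
    Values near 1 mean smooth motion; small or negative values mark sharp turns."""
    diffs = np.diff(reps, axis=0)                                    # (T-1, d)
    a, b = diffs[:-1], diffs[1:]
    denom = np.linalg.norm(a, axis=1) * np.linalg.norm(b, axis=1) + 1e-8
    return (a * b).sum(axis=1) / denom

# Toy trajectory: drifts along x, then abruptly starts drifting along y.
seg1 = np.stack([np.arange(10, dtype=float), np.zeros(10)], axis=1)
seg2 = np.stack([np.full(10, 9.0), np.arange(1, 11, dtype=float)], axis=1)
reps = np.vstack([seg1, seg2])
cos = turning_angles(reps)
print("candidate boundary at t =", int(np.argmin(cos)) + 1)  # sharpest turn -> boundary
```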
Tight PAC-Bayesian Risk Certificates for Contrastive Learning
Contrastive representation learning is a modern paradigm for learning representations of
unlabeled data via augmentations: precisely, contrastive models learn to embed …
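For concreteness, the kind of contrastive objective such certificates typically address: a minimal NumPy InfoNCE sketch in which two augmented views of the same sample form a positive pair and the other samples in the batch act as negatives (an illustrative loss, not the paper's specific setup or bound).

```python
import numpy as np

def info_nce(z1: np.ndarray, z2: np.ndarray, temperature: float = 0.1) -> float:
    """InfoNCE: row i of z1 and row i of z2 are two views of the same sample
    (positive pair); every other row of z2 serves as a negative."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = (z1 @ z2.T) / temperature                               # (n, n) similarities
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_prob)))                        # pull positives together

rng = np.random.default_rng(2)
anchor = rng.normal(size=(16, 32))
views = anchor + 0.05 * rng.normal(size=(16, 32))                    # mild "augmentation" noise
print(info_nce(anchor, views))
```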
Towards Understanding the Mechanism of Contrastive Learning via Similarity Structure: A Theoretical Analysis
Contrastive learning is an efficient approach to self-supervised representation learning.
Although recent studies have made progress in the theoretical understanding of contrastive …
Contrastive Approach to Prior Free Positive Unlabeled Learning
Positive Unlabeled (PU) learning refers to the task of learning a binary classifier given a few
labeled positive samples, and a set of unlabeled samples (which could be positive or …
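For context, the classical PU risk estimators that a prior-free approach seeks to avoid depend on the positive class prior. A minimal NumPy sketch of the non-negative PU risk estimator (Kiryo et al., 2017), shown only to illustrate that dependence, not as this paper's method:

```python
import numpy as np

def logistic_loss(scores: np.ndarray, label: int) -> np.ndarray:
    """Logistic surrogate loss for binary labels in {+1, -1}."""
    return np.log1p(np.exp(-label * scores))

def nnpu_risk(scores_pos: np.ndarray, scores_unl: np.ndarray, prior: float) -> float:
    """Non-negative PU risk: estimates the usual classification risk from
    positives + unlabeled data, but requires the class prior as an input."""
    r_pos = prior * logistic_loss(scores_pos, +1).mean()
    r_neg = logistic_loss(scores_unl, -1).mean() - prior * logistic_loss(scores_pos, -1).mean()
    return float(r_pos + max(0.0, r_neg))

rng = np.random.default_rng(3)
scores_pos = rng.normal(loc=1.0, size=100)    # classifier scores on labeled positives
scores_unl = rng.normal(loc=-0.2, size=500)   # scores on the unlabeled mix
print(nnpu_risk(scores_pos, scores_unl, prior=0.3))
```

Estimating or sidestepping that prior is the crux that a contrastive, prior-free formulation targets.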