A survey on deep neural network pruning: Taxonomy, comparison, analysis, and recommendations
Modern deep neural networks, particularly recent large language models, come with
massive model sizes that require significant computational and storage resources. To …
Knowledge distillation on graphs: A survey
Graph Neural Networks (GNNs) have received significant attention for demonstrating their
capability to handle graph data. However, they are difficult to deploy in resource …
Evaluating post-hoc explanations for graph neural networks via robustness analysis
This work studies the evaluation of explaining graph neural networks (GNNs), which is
crucial to the credibility of post-hoc explainability in practical usage. Conventional evaluation …
Unleashing the power of graph data augmentation on covariate distribution shift
The issue of distribution shifts is emerging as a critical concern in graph representation
learning. From the perspective of invariant learning and stable learning, a recently well …
S2GAE: Self-supervised graph autoencoders are generalizable learners with graph masking
Self-supervised learning (SSL) has been demonstrated to be effective in pre-training models
that can be generalized to various downstream tasks. Graph Autoencoder (GAE), an …
Handling distribution shifts on graphs: An invariance perspective
There is increasing evidence suggesting neural networks' sensitivity to distribution shifts, so
that research on out-of-distribution (OOD) generalization comes into the spotlight …
Chasing sparsity in vision transformers: An end-to-end exploration
Vision transformers (ViTs) have recently received explosive popularity, but their enormous
model sizes and training costs remain daunting. Conventional post-training pruning often …
Structure-free graph condensation: From large-scale graphs to condensed graph-free data
Graph condensation, which reduces the size of a large-scale graph by synthesizing a small-
scale condensed graph as its substitution, has immediate benefits for various graph learning …
Trustworthy graph neural networks: Aspects, methods and trends
Graph neural networks (GNNs) have emerged as a series of competent graph learning
methods for diverse real-world scenarios, ranging from daily applications like …
The lottery tickets hypothesis for supervised and self-supervised pre-training in computer vision models
The computer vision world has been re-gaining enthusiasm in various pre-trained models,
including both classical ImageNet supervised pre-training and recently emerged self …