Conditional adapters: Parameter-efficient transfer learning with fast inference
We propose Conditional Adapter (CoDA), a parameter-efficient transfer learning
method that also improves inference efficiency. CoDA generalizes beyond standard adapter …
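The snippet cuts off before the method details, but the core idea of pairing an adapter with conditional computation can be illustrated. Below is a minimal PyTorch sketch of a token-routed bottleneck adapter; the class, the router design, and all parameter names are hypothetical, and this is a generic conditional-computation adapter, not CoDA's actual mechanism.

```python
import torch
import torch.nn as nn

class ConditionalAdapter(nn.Module):
    """Toy bottleneck adapter with per-token conditional computation.

    Hypothetical sketch: a learned router scores tokens and only the top-k
    pass through the adapter; the rest ride the residual path unchanged,
    which is where the inference savings would come from. Not CoDA itself.
    """
    def __init__(self, d_model: int, d_bottleneck: int, k: int):
        super().__init__()
        self.router = nn.Linear(d_model, 1)           # per-token relevance score
        self.down = nn.Linear(d_model, d_bottleneck)  # adapter down-projection
        self.up = nn.Linear(d_bottleneck, d_model)    # adapter up-projection
        self.k = k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        scores = self.router(x).squeeze(-1)                  # (batch, seq_len)
        idx = scores.topk(self.k, dim=-1).indices            # tokens to adapt
        idx = idx.unsqueeze(-1).expand(-1, -1, x.size(-1))
        selected = x.gather(1, idx)                          # routed tokens only
        updated = selected + self.up(torch.relu(self.down(selected)))
        return x.scatter(1, idx, updated)                    # rest pass through

out = ConditionalAdapter(d_model=64, d_bottleneck=8, k=4)(torch.randn(2, 16, 64))
```

Because only k of the 16 tokens enter the adapter, the extra compute scales with k rather than sequence length, which is the intuition behind combining parameter efficiency with inference efficiency.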
Differentiable transportation pruning
Deep learning algorithms are increasingly employed at the edge. However, edge devices
are resource-constrained and thus require efficient deployment of deep neural networks …
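The snippet stops before describing the method. As a rough stand-in for the general idea of learning which weights to keep by gradient descent, here is a generic differentiable-mask sketch using a straight-through estimator; the paper's optimal-transport formulation is more involved and is not reproduced here, and all names are hypothetical.

```python
import torch
import torch.nn as nn

class StraightThroughMask(nn.Module):
    """Generic differentiable pruning mask (straight-through estimator).

    A sketch of differentiable pruning in general, not the paper's
    transportation-based method: the forward pass applies an exact 0/1
    mask at the target sparsity, while gradients flow through a smooth
    sigmoid surrogate so the importance scores remain trainable.
    """
    def __init__(self, shape, sparsity: float = 0.9):
        super().__init__()
        self.scores = nn.Parameter(torch.randn(shape))  # learned importance
        self.sparsity = sparsity

    def forward(self, weight: torch.Tensor) -> torch.Tensor:
        k = max(1, int(self.sparsity * self.scores.numel()))
        threshold = self.scores.flatten().kthvalue(k).values
        hard = (self.scores > threshold).float()   # exact 0/1 mask in forward
        soft = torch.sigmoid(self.scores)          # smooth surrogate for backward
        mask = hard + soft - soft.detach()         # straight-through trick
        return weight * mask
```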
Advancing dynamic sparse training by exploring optimization opportunities
Dynamic Sparse Training (DST) is an effective approach for addressing the substantial
training resource requirements posed by the ever-increasing size of the Deep Neural …
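For context on what a DST step looks like, here is a sketch of a prune-and-regrow mask update in the spirit of prior DST methods such as RigL; this illustrates the general loop the paper builds on, not this paper's own optimization, and the function name and `frac` parameter are made up.

```python
import torch

def prune_and_regrow(weight, grad, mask, frac=0.3):
    """One RigL-style mask update (generic DST sketch, not this paper's).

    Drops the smallest-magnitude active weights and regrows the same number
    of currently inactive connections where the dense gradient is largest,
    so the overall sparsity level stays constant throughout training.
    """
    n = int(frac * mask.sum().item())
    # Score before editing the mask so drop and grow sets stay disjoint.
    drop_scores = weight.abs().masked_fill(~mask.bool(), float("inf"))
    grow_scores = grad.abs().masked_fill(mask.bool(), float("-inf"))
    mask.view(-1)[drop_scores.view(-1).topk(n, largest=False).indices] = 0.0
    mask.view(-1)[grow_scores.view(-1).topk(n).indices] = 1.0
    return mask
```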
Is overfitting necessary for implicit video representation?
Compact representation of multimedia signals using implicit neural representations (INRs)
has advanced significantly over the past few years, and recent works address their …
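The question in the title turns on the standard INR setup, which the snippet only gestures at: a network is deliberately overfit to a single signal so its weights become the representation. A generic sketch of that setup for video, with made-up sizes and no particular method from the paper:

```python
import torch
import torch.nn as nn

# Toy implicit video representation: an MLP is deliberately overfit to map
# normalized (x, y, t) coordinates to RGB values, so the network weights
# become the compressed signal.
inr = nn.Sequential(
    nn.Linear(3, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, 3), nn.Sigmoid(),      # RGB in [0, 1]
)

video = torch.rand(8, 32, 32, 3)          # stand-in clip: (T, H, W, RGB)
t, y, x = torch.meshgrid(
    torch.linspace(0, 1, 8), torch.linspace(0, 1, 32),
    torch.linspace(0, 1, 32), indexing="ij")
coords = torch.stack([x, y, t], dim=-1).reshape(-1, 3)
targets = video.reshape(-1, 3)

opt = torch.optim.Adam(inr.parameters(), lr=1e-3)
for _ in range(200):                      # fitting == "overfitting" by design
    opt.zero_grad()
    torch.nn.functional.mse_loss(inr(coords), targets).backward()
    opt.step()
```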
PETAH: Parameter Efficient Task Adaptation for Hybrid Transformers in a resource-limited Context
Following their success in natural language processing (NLP), there has been a shift
towards transformer models in computer vision. While transformers perform well and offer …
Mixed Sparsity Training: Achieving 4× FLOP Reduction for Transformer Pretraining
Large language models (LLMs) have made significant strides in complex tasks, yet their
widespread adoption is impeded by substantial computational demands. With hundreds of …