Obow: Online bag-of-visual-words generation for self-supervised learning
Learning image representations without human supervision is an important and active
research field. Several recent approaches have successfully leveraged the idea of making …
Few-shot object detection by knowledge distillation using bag-of-visual-words representations
While fine-tuning-based methods for few-shot object detection have achieved remarkable
progress, a crucial challenge that has not been addressed well is the potential class-specific …
Toward Cross-Lingual Social Event Detection with Hybrid Knowledge Distillation
Recently published graph neural networks (GNNs) show promising performance on social
event detection tasks. However, most studies are oriented toward monolingual data in …
Knowledge distillation meets open-set semi-supervised learning
Existing knowledge distillation methods mostly focus on distilling the teacher's predictions
and intermediate activations. However, the structured representation, which arguably is one …
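The snippet above mentions distilling the teacher's intermediate activations. As a minimal, generic sketch of the common hint-based (FitNets-style) variant of that idea (not this paper's proposed method), the channel widths below are illustrative assumptions:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureDistillationLoss(nn.Module):
    """Hint-based distillation: project the student's intermediate
    feature map to the teacher's channel width with a 1x1 conv,
    then penalize the squared difference between the two maps."""

    def __init__(self, student_channels=64, teacher_channels=256):  # assumed widths
        super().__init__()
        # A 1x1 convolution bridges the dimensionality gap between the networks.
        self.regressor = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)

    def forward(self, student_feat, teacher_feat):
        # Teacher features are detached: only the student receives gradients.
        return F.mse_loss(self.regressor(student_feat), teacher_feat.detach())
```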
Knowledge Distillation Layer that Lets the Student Decide
A typical technique in knowledge distillation (KD) is to regularize the learning of a limited-
capacity model (the student) by pushing its responses to match those of a powerful model (the teacher) …
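As a generic illustration of the logit-matching objective described above (the standard Hinton-style KD loss, not this paper's specific method), here is a minimal PyTorch sketch; the temperature T and mixing weight alpha are assumed hyperparameters:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Soften both models' logits with temperature T, push the student's
    distribution toward the teacher's, and mix in the usual
    cross-entropy on the ground-truth labels."""
    # KL divergence between temperature-softened distributions; the T**2
    # factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```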
FLRKD: Relational Knowledge Distillation Based on Channel-wise Feature Quality Assessment
Z An, C Deng, W Dang, Z Dong, Q Luo, J Cheng - BMVC, 2023 - papers.bmvc2023.org
With the increasing computational power of computing devices, the pre-training of large
deep-learning models has become prevalent. However, deploying such models on edge …
FA-GAL-ResNet: Lightweight Residual Network using Focused Attention Mechanism and Generative Adversarial Learning via Knowledge Distillation
Although deep neural networks have achieved satisfactory performance, they rely on
powerful hardware for training, which is expensive and not easy to obtain. Therefore, the …
Deep face recognition in the wild
J Yang - 2022 - eprints.nottingham.ac.uk
Face recognition has attracted particular interest in biometric recognition, with wide
applications in security, entertainment, health, and marketing. Recent years have witnessed …
Knowledge Distillation By Sparse Representation Matching
Knowledge Distillation refers to a class of methods that transfer knowledge from a
teacher network to a student network. In this paper, we propose Sparse Representation …