Pro-KD: Progressive distillation by following the footsteps of the teacher
With the ever-growing scale of neural models, knowledge distillation (KD) is attracting more attention
as a prominent tool for neural model compression. However, there are counter-intuitive …
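For background on the technique named in this entry: knowledge distillation trains a compact "student" network to match the softened output distribution of a larger "teacher," in addition to fitting the ground-truth labels. The sketch below shows the standard Hinton-style KD loss in PyTorch; it is generic background rather than the Pro-KD schedule the (truncated) abstract alludes to, and the temperature T and mixing weight alpha are illustrative assumptions.

    import torch
    import torch.nn.functional as F

    def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
        # Soften teacher and student distributions with temperature T.
        soft_targets = F.softmax(teacher_logits / T, dim=-1)
        log_student = F.log_softmax(student_logits / T, dim=-1)
        # KL term between teacher and student; the T**2 factor keeps
        # gradient magnitudes comparable across temperatures.
        kd_term = F.kl_div(log_student, soft_targets,
                           reduction="batchmean") * (T ** 2)
        # Ordinary cross-entropy on the hard labels.
        ce_term = F.cross_entropy(student_logits, labels)
        return alpha * kd_term + (1.0 - alpha) * ce_term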
Image Caption Enhancement with GRIT, Portable ResNet and BART Context-Tuning
W Zhang, J Ma - 2022 6th International Conference on …, 2022 - ieeexplore.ieee.org
This paper aims to create a novel image captioning architecture that fuses the Grid- and
Region-based image caption Transformer (GRIT), ResNet, and the BART language model to offer a …