A full dive into realizing the edge-enabled metaverse: Visions, enabling technologies, and challenges
Dubbed “the successor to the mobile Internet,” the concept of the Metaverse has grown in
popularity. While there exist lite versions of the Metaverse today, they are still far from …
Dataset distillation: A comprehensive review
Recent success of deep learning is largely attributed to the sheer amount of data used for
training deep neural networks. Despite the unprecedented success, the massive data …
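To make the topic concrete, here is a minimal sketch of one family of methods such a review covers: learning a tiny synthetic set whose training gradients match those of the real data. Everything below (the random toy data, the one-layer model, the cosine objective) is our own illustrative choice, not code from the review.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
real_x = torch.randn(200, 1, 8, 8)                    # stand-in "real" dataset
real_y = torch.randint(0, 2, (200,))
syn_x = torch.randn(2, 1, 8, 8, requires_grad=True)   # one synthetic image per class
syn_y = torch.tensor([0, 1])

model = nn.Sequential(nn.Flatten(), nn.Linear(64, 2)) # deliberately tiny network
opt = torch.optim.SGD([syn_x], lr=0.1)

def grad_vec(x, y):
    # Flattened gradient of the training loss w.r.t. the model parameters.
    loss = F.cross_entropy(model(x), y)
    grads = torch.autograd.grad(loss, list(model.parameters()), create_graph=True)
    return torch.cat([g.reshape(-1) for g in grads])

for step in range(100):
    g_real = grad_vec(real_x, real_y).detach()  # target gradients from real data
    g_syn = grad_vec(syn_x, syn_y)              # differentiable w.r.t. syn_x
    loss = 1 - F.cosine_similarity(g_real, g_syn, dim=0)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Real gradient-matching methods also re-sample the network weights across steps; the fixed model here keeps the sketch short.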
Camel: Communicative agents for "mind" exploration of large language model society
The rapid advancement of chat-based language models has led to remarkable progress in
complex task-solving. However, their success heavily relies on human input to guide the …
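The gist of the paper's role-playing setup can be sketched as two agents that prompt each other in turns, with no human between them. The toy below assumes a stubbed `respond` function standing in for any chat-model API call; CAMEL's actual inception prompting and role assignment are omitted.

```python
def respond(system_prompt: str, history: list[str]) -> str:
    # Placeholder for a real LLM call; here it just echoes progress.
    return f"[{system_prompt}] step {len(history) // 2 + 1}: refine the plan"

def role_play(task: str, rounds: int = 3) -> list[str]:
    history = [f"Task: {task}"]
    for _ in range(rounds):
        # The "user" agent issues the next instruction toward the task...
        history.append(respond("AI user: give the next instruction", history))
        # ...and the "assistant" agent carries it out, closing the loop.
        history.append(respond("AI assistant: carry out the instruction", history))
    return history

for msg in role_play("design a trading bot"):
    print(msg)
```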
Decoupled knowledge distillation
State-of-the-art distillation methods are mainly based on distilling deep features from
intermediate layers, while the significance of logit distillation is greatly overlooked. To …
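For context, classical logit distillation (which this paper revisits and decomposes) trains the student to match the teacher's temperature-softened output distribution. A minimal version, with our own choice of temperature T and mixing weight alpha, might look like:

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft targets: teacher distribution softened by temperature T.
    soft_targets = F.softmax(teacher_logits / T, dim=1)
    log_student = F.log_softmax(student_logits / T, dim=1)
    # The KL term is scaled by T^2 so gradients stay comparable across T.
    kl = F.kl_div(log_student, soft_targets, reduction="batchmean") * (T * T)
    # Ordinary cross-entropy on the hard labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kl + (1 - alpha) * ce

# Toy usage: random logits for a batch of 4 over 10 classes.
s = torch.randn(4, 10)
t = torch.randn(4, 10)
y = torch.randint(0, 10, (4,))
print(kd_loss(s, t, y))
```

Decoupled KD's contribution is to split this KL term into target-class and non-target-class parts and weight them separately; the sketch shows only the vanilla loss it starts from.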
Point-to-voxel knowledge distillation for lidar semantic segmentation
This article addresses the problem of distilling knowledge from a large teacher model to a
slim student network for LiDAR semantic segmentation. Directly employing previous …
Knowledge distillation from a stronger teacher
Unlike existing knowledge distillation methods, which focus on baseline settings where the
teacher models and training strategies are not as strong and competitive as state-of-the-art …
Class-incremental learning by knowledge distillation with adaptive feature consolidation
We present a novel class incremental learning approach based on deep neural networks,
which continually learns new tasks with limited memory for storing examples in the previous …
Going deeper with image transformers
Transformers have recently been adapted for large-scale image classification, achieving
high scores that shake up the long supremacy of convolutional neural networks. However, the …
A survey of quantization methods for efficient neural network inference
This chapter surveys approaches to the problem of quantizing the numerical values in deep
neural network computations, covering the advantages and disadvantages of current methods …
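As a concrete instance of the schemes such surveys cover, uniform affine quantization maps floats to k-bit integers via a per-tensor scale and zero-point. The sketch below is a generic toy, not code from the chapter.

```python
import torch

def quantize_uniform(x, num_bits=8):
    # Asymmetric uniform quantization: map floats to [0, 2^b - 1]
    # integer codes using a per-tensor scale and zero-point.
    qmin, qmax = 0, 2 ** num_bits - 1
    scale = (x.max() - x.min()).clamp(min=1e-8) / (qmax - qmin)
    zero_point = torch.round(-x.min() / scale).clamp(qmin, qmax)
    q = torch.clamp(torch.round(x / scale) + zero_point, qmin, qmax)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    # Map integer codes back to (approximate) floats.
    return scale * (q - zero_point)

x = torch.randn(4, 4)
q, s, z = quantize_uniform(x)
print((x - dequantize(q, s, z)).abs().max())  # error is at most ~scale/2
```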
Factorizing knowledge in neural networks
In this paper, we explore a novel and ambitious knowledge-transfer task, termed Knowledge
Factorization (KF). The core idea of KF lies in the modularization and assemblability of …