A full dive into realizing the edge-enabled metaverse: Visions, enabling technologies, and challenges

M Xu, WC Ng, WYB Lim, J Kang, Z Xiong… - … Surveys & Tutorials, 2022 - ieeexplore.ieee.org
Dubbed “the successor to the mobile Internet,” the concept of the Metaverse has grown in
popularity. While there exist lite versions of the Metaverse today, they are still far from …

Dataset distillation: A comprehensive review

R Yu, S Liu, X Wang - IEEE Transactions on Pattern Analysis …, 2023 - ieeexplore.ieee.org
The recent success of deep learning is largely attributed to the sheer amount of data used for
training deep neural networks. Despite this unprecedented success, the massive data …

Camel: Communicative agents for "mind" exploration of large language model society

G Li, H Hammoud, H Itani… - Advances in Neural …, 2023 - proceedings.neurips.cc
The rapid advancement of chat-based language models has led to remarkable progress in
complex task-solving. However, their success heavily relies on human input to guide the …

Decoupled knowledge distillation

B Zhao, Q Cui, R Song, Y Qiu… - Proceedings of the IEEE …, 2022 - openaccess.thecvf.com
State-of-the-art distillation methods are mainly based on distilling deep features from
intermediate layers, while the significance of logit distillation is greatly overlooked. To …
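
For orientation, a minimal sketch of the classic Hinton-style logit-distillation loss that this line of work revisits; the paper's decoupling into target-class and non-target-class terms refines this baseline (sketch assumes PyTorch; function name is ours):

    import torch
    import torch.nn.functional as F

    def kd_logit_loss(student_logits, teacher_logits, T=4.0):
        # Classic logit distillation: KL divergence between the
        # temperature-softened teacher and student distributions.
        p_teacher = F.softmax(teacher_logits / T, dim=-1)
        log_p_student = F.log_softmax(student_logits / T, dim=-1)
        # The T**2 factor restores gradient magnitudes after temperature softening.
        return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T ** 2)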

Point-to-voxel knowledge distillation for lidar semantic segmentation

Y Hou, X Zhu, Y Ma, CC Loy… - Proceedings of the IEEE …, 2022 - openaccess.thecvf.com
This article addresses the problem of distilling knowledge from a large teacher model to a
slim student network for LiDAR semantic segmentation. Directly employing previous …

Knowledge distillation from a stronger teacher

T Huang, S You, F Wang, C Qian… - Advances in Neural …, 2022 - proceedings.neurips.cc
Unlike existing knowledge distillation methods, which focus on baseline settings where the
teacher models and training strategies are not as strong and competitive as the state-of-the-art …

Class-incremental learning by knowledge distillation with adaptive feature consolidation

M Kang, J Park, B Han - … of the IEEE/CVF conference on …, 2022 - openaccess.thecvf.com
We present a novel class-incremental learning approach based on deep neural networks,
which continually learns new tasks with limited memory for storing examples in the previous …
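
As a rough illustration of feature consolidation (a sketch of the general idea under our own naming, not the paper's exact formulation): penalize how far intermediate features drift from the frozen old model, weighted by an importance estimate:

    import torch

    def consolidation_loss(feat_new, feat_old, importance):
        # Importance-weighted penalty on feature drift between the model being
        # trained on new classes and a frozen copy from the previous stage.
        # `importance` is a hypothetical precomputed per-channel weight tensor.
        return (importance * (feat_new - feat_old.detach()) ** 2).mean()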

Going deeper with image transformers

H Touvron, M Cord, A Sablayrolles… - Proceedings of the …, 2021 - openaccess.thecvf.com
Transformers have been recently adapted for large-scale image classification, achieving
high scores that shake up the long supremacy of convolutional neural networks. However, the …
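
One device this paper (CaiT) uses to train deeper image transformers is LayerScale: each residual branch is multiplied by a learnable per-channel vector initialized near zero, so a deep stack starts close to the identity. A minimal sketch (module name and init value are illustrative):

    import torch
    import torch.nn as nn

    class LayerScaleResidual(nn.Module):
        # Residual connection whose branch output is scaled by a learnable
        # per-channel diagonal, initialized small for stable deep training.
        def __init__(self, dim, block, init_eps=1e-5):
            super().__init__()
            self.block = block  # e.g. a self-attention or MLP sub-block
            self.gamma = nn.Parameter(init_eps * torch.ones(dim))

        def forward(self, x):
            return x + self.gamma * self.block(x)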

A survey of quantization methods for efficient neural network inference

A Gholami, S Kim, Z Dong, Z Yao… - Low-Power Computer …, 2022 - taylorfrancis.com
This chapter surveys approaches to the problem of quantizing the numerical values in deep
neural network computations, covering the advantages and disadvantages of current methods …
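
As a concrete reference point, the simplest scheme such surveys cover is uniform affine (asymmetric) quantization. A minimal simulated-quantization sketch (function name is ours):

    import torch

    def fake_quantize(x, num_bits=8):
        # Map the float range [x.min(), x.max()] onto the integer grid
        # [0, 2^b - 1], then dequantize so the output carries the rounding error.
        qmin, qmax = 0, 2 ** num_bits - 1
        scale = (x.max() - x.min()).clamp(min=1e-8) / (qmax - qmin)
        zero_point = torch.round(qmin - x.min() / scale)
        q = torch.clamp(torch.round(x / scale) + zero_point, qmin, qmax)
        return (q - zero_point) * scale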

Factorizing knowledge in neural networks

X Yang, J Ye, X Wang - European Conference on Computer Vision, 2022 - Springer
In this paper, we explore a novel and ambitious knowledge-transfer task, termed Knowledge
Factorization (KF). The core idea of KF lies in the modularization and assemblability of …