Efficient acceleration of deep learning inference on resource-constrained edge devices: A review
Successful integration of deep neural networks (DNNs) or deep learning (DL) has resulted
in breakthroughs in many areas. However, deploying these highly accurate models for data …
Knowledge distillation: A survey
In recent years, deep neural networks have been successful in both industry and academia,
especially for computer vision tasks. The great success of deep learning is mainly due to its …
Edge intelligence: Architectures, challenges, and applications
Edge intelligence refers to a set of connected systems and devices for data collection,
caching, processing, and analysis in locations close to where data is captured based on …
Deep interest evolution network for click-through rate prediction
Click-through rate (CTR) prediction, whose goal is to estimate the probability of a user
clicking on the item, has become one of the core tasks in the advertising system. For CTR …
Edge intelligence: Empowering intelligence to the edge of network
Edge intelligence refers to a set of connected systems and devices for data collection,
caching, processing, and analysis in proximity to where data are captured based on artificial …
Knowledge distillation via instance relationship graph
The key challenge of knowledge distillation is to extract general, moderate and sufficient
knowledge from a teacher network to guide a student network. In this paper, a novel …
Knowledge distillation via route constrained optimization
Distillation-based learning boosts the performance of the miniaturized neural network based
on the hypothesis that the representation of a teacher model can be used as structured and …
Low-resolution face recognition in the wild via selective knowledge distillation
Typically, face recognition models deployed in the wild must identify low-resolution faces at extremely low computational cost. To address this problem, a feasible …
Transforming large-size to lightweight deep neural networks for IoT applications
Deep Neural Networks (DNNs) have gained unprecedented popularity due to their high-
order performance and automated feature extraction capability. This has encouraged …
2D Human pose estimation: A survey
Human pose estimation aims at localizing human anatomical keypoints or body parts in the
input data (e.g., images, videos, or signals). It forms a crucial component in enabling …