Spatial transcriptomics prediction from histology jointly through transformer and graph neural networks

Y Zeng, Z Wei, W Yu, R Yin, Y Yuan, B Li… - Briefings in …, 2022 - academic.oup.com
The rapid development of spatial transcriptomics allows the measurement of RNA
abundance at a high spatial resolution, making it possible to simultaneously profile gene …

M3AE: Multimodal representation learning for brain tumor segmentation with missing modalities

H Liu, D Wei, D Lu, J Sun, L Wang… - Proceedings of the AAAI …, 2023 - ojs.aaai.org
Multimodal magnetic resonance imaging (MRI) provides complementary information for
sub-region analysis of brain tumors. Plenty of methods have been proposed for automatic brain …

Categories of response-based, feature-based, and relation-based knowledge distillation

C Yang, X Yu, Z An, Y Xu - … in knowledge distillation: towards new horizons …, 2023 - Springer
Deep neural networks have achieved remarkable performance for artificial intelligence
tasks. The success behind intelligent systems often relies on large-scale models with high …

Learning from yourself: A self-distillation method for fake speech detection

J Xue, C Fan, J Yi, C Wang, Z Wen… - ICASSP 2023-2023 …, 2023 - ieeexplore.ieee.org
In this paper, we propose a novel self-distillation method for fake speech detection (FSD),
which can significantly improve the performance of FSD without increasing the model …

Efficient Deep Learning Infrastructures for Embedded Computing Systems: A Comprehensive Survey and Future Envision

X Luo, D Liu, H Kong, S Huai, H Chen… - ACM Transactions on …, 2024 - dl.acm.org
Deep neural networks (DNNs) have recently achieved impressive success across a wide
range of real-world vision and language processing tasks, spanning from image …

A Survey on Knowledge Distillation: Recent Advancements

A Moslemi, A Briskina, Z Dang, J Li - Machine Learning with Applications, 2024 - Elsevier
Deep learning has achieved notable success across academia, medicine, and industry. Its
ability to identify complex patterns in large-scale data and to manage millions of parameters …

CCSD: Cross-camera self-distillation for unsupervised person re-identification

J Chen, C Gao, L Sun, N Sang - Visual Intelligence, 2023 - Springer
Existing unsupervised person re-identification (Re-ID) methods have achieved remarkable
performance by alternating between clustering and training. However, they still suffer …

Self-distillation and self-supervision for partial label learning

X Yu, S Sun, Y Tian - Pattern Recognition, 2024 - Elsevier
As a main branch of the weakly supervised learning paradigm, partial label learning (PLL)
copes with situations where each sample corresponds to ambiguous candidate labels …

Self-knowledge distillation based self-supervised learning for COVID-19 detection from chest X-ray images

G Li, R Togo, T Ogawa… - ICASSP 2022-2022 IEEE …, 2022 - ieeexplore.ieee.org
The global outbreak of coronavirus disease 2019 (COVID-19) has overloaded worldwide
healthcare systems. Computer-aided diagnosis for fast COVID-19 detection and patient …

Target-embedding autoencoder with knowledge distillation for multi-label classification

Y Ma, X Zou, Q Pan, M Yan, G Li - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
In multi-label classification, determining the correlations between labels is a key challenge.
One solution is the Target Embedding Autoencoder (TEA), but most …