Multi-task deep learning for medical image computing and analysis: A review

Y Zhao, X Wang, T Che, G Bao, S Li - Computers in Biology and Medicine, 2023 - Elsevier
The renaissance of deep learning has provided promising solutions to various tasks. While
conventional deep learning models are constructed for a single specific task, multi-task deep …

Deep Convolution Neural Network sharing for the multi-label images classification

S Coulibaly, B Kamsu-Foguem, D Kamissoko… - Machine learning with …, 2022 - Elsevier
Addressing issues related to multi-label classification is relevant in many fields of
applications. In this work, we present a multi-label classification architecture based on Multi …

Multi-task learning with deep neural networks: A survey

M Crawshaw - arXiv preprint arXiv:2009.09796, 2020 - arxiv.org
Multi-task learning (MTL) is a subfield of machine learning in which multiple tasks are
simultaneously learned by a shared model. Such approaches offer advantages like …
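The snippet above describes the core idea of MTL: multiple tasks learned simultaneously by a shared model. As a minimal, hypothetical sketch (not from any of the listed papers; all names and shapes are illustrative), hard parameter sharing can be expressed as a shared encoder feeding per-task heads:

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared encoder weights: every task reuses these parameters.
W_shared = rng.standard_normal((16, 8))

# Task-specific heads: one small weight matrix per task.
W_heads = {
    "segmentation": rng.standard_normal((8, 4)),
    "depth": rng.standard_normal((8, 1)),
}

def forward(x, task):
    """Run input through the shared encoder, then the chosen task head."""
    h = np.maximum(x @ W_shared, 0.0)  # shared ReLU features
    return h @ W_heads[task]

x = rng.standard_normal((2, 16))       # batch of 2 inputs
seg_out = forward(x, "segmentation")   # shape (2, 4)
depth_out = forward(x, "depth")        # shape (2, 1)
```

Because `W_shared` appears in every task's forward pass, gradients from all task losses would update it jointly, which is the "shared model" advantage the survey snippet refers to.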

Inverted pyramid multi-task transformer for dense scene understanding

H Ye, D Xu - European Conference on Computer Vision, 2022 - Springer
Multi-task dense scene understanding is a thriving research domain that requires
simultaneous perception and reasoning on a series of correlated tasks with pixel-wise …

Real-world image super-resolution as multi-task learning

W Zhang, X Li, G Shi, X Chen, Y Qiao… - Advances in …, 2024 - proceedings.neurips.cc
In this paper, we take a new look at real-world image super-resolution (real-SR) from a multi-
task learning perspective. We demonstrate that the conventional formulation of real-SR can …

TaskExpert: Dynamically assembling multi-task representations with memorial mixture-of-experts

H Ye, D Xu - Proceedings of the IEEE/CVF International …, 2023 - openaccess.thecvf.com
Learning discriminative task-specific features simultaneously for multiple distinct tasks is a
fundamental problem in multi-task learning. Recent state-of-the-art models consider directly …

MTFormer: Multi-task learning via transformer and cross-task reasoning

X Xu, H Zhao, V Vineet, SN Lim, A Torralba - European Conference on …, 2022 - Springer
In this paper, we explore the advantages of utilizing transformer structures for addressing
multi-task learning (MTL). Specifically, we demonstrate that models with transformer …

MiLeNAS: Efficient neural architecture search via mixed-level reformulation

C He, H Ye, L Shen, T Zhang - Proceedings of the IEEE …, 2020 - openaccess.thecvf.com
Many recently proposed methods for Neural Architecture Search (NAS) can be formulated
as bilevel optimization. For efficient implementation, its solution requires approximations of …

Auto-lambda: Disentangling dynamic task relationships

S Liu, S James, AJ Davison, E Johns - arXiv preprint arXiv:2202.03091, 2022 - arxiv.org
Understanding the structure of multiple related tasks allows for multi-task learning to improve
the generalisation ability of one or all of them. However, it usually requires training each …

Weight-sharing neural architecture search: A battle to shrink the optimization gap

L **e, X Chen, K Bi, L Wei, Y Xu, L Wang… - ACM Computing …, 2021 - dl.acm.org
Neural architecture search (NAS) has attracted increasing attention. In recent years,
individual search methods have been replaced by weight-sharing search methods for higher …