A brief review on multi-task learning
KH Thung, CY Wee - Multimedia Tools and Applications, 2018 - Springer
Abstract Multi-task learning (MTL), which optimizes multiple related learning tasks at the
same time, has been widely used in various applications, including natural language …
Transfer learning for speech and language processing
Transfer learning is a vital technique that generalizes models trained for one setting or task
to other settings or tasks. For example in speech recognition, an acoustic model trained for …
Cross-stitch networks for multi-task learning
Multi-task learning in Convolutional Networks has displayed remarkable success in the field
of recognition. This success can be largely attributed to learning shared representations …
One model to learn them all
Deep learning yields great results across many fields, from speech recognition, image
classification, to translation. But for each problem, getting a deep model to work well …
Automatic analysis of facial affect: A survey of registration, representation, and recognition
Automatic affect analysis has attracted great interest in various contexts including the
recognition of action units and basic or non-basic emotions. In spite of major efforts, there …
Auxiliary tasks in multi-task learning
Multi-task convolutional neural networks (CNNs) have shown impressive results for certain
combinations of tasks, such as single-image depth estimation (SIDE) and semantic …
Invariant models for causal transfer learning
Methods of transfer learning try to combine knowledge from several related tasks (or
domains) to improve performance on a test task. Inspired by causal methodology, we relax …
Multi-task learning for natural language processing in the 2020s: Where are we going?
Multi-task learning (MTL) significantly pre-dates the deep learning era, and it has seen a
resurgence in the past few years as researchers have been applying MTL to deep learning …
From depth what can you see? Depth completion via auxiliary image reconstruction
Depth completion recovers dense depth from sparse measurements, e.g., LiDAR. Existing
depth-only methods use sparse depth as the only input. However, these methods may fail to …
Multilinear multitask learning
Many real world datasets occur or can be arranged into multi-modal structures. With such
datasets, the tasks to be learnt can be referenced by multiple indices. Current multitask …