Advances and challenges in meta-learning: A technical review
Meta-learning empowers learning systems with the ability to acquire knowledge from
multiple tasks, enabling faster adaptation and generalization to new tasks. This review …
Meta-learning in neural networks: A survey
The field of meta-learning, or learning-to-learn, has seen a dramatic rise in interest in recent
years. Contrary to conventional approaches to AI where tasks are solved from scratch using …
Weight-sharing neural architecture search: A battle to shrink the optimization gap
Neural architecture search (NAS) has attracted increasing attention. In recent years,
individual search methods have been replaced by weight-sharing search methods for higher …
Design space for graph neural networks
The rapid evolution of Graph Neural Networks (GNNs) has led to a growing number of new
architectures as well as novel applications. However, current research focuses on proposing …
NAS-FAS: Static-dynamic central difference network search for face anti-spoofing
Face anti-spoofing (FAS) plays a vital role in securing face recognition systems. Existing
methods heavily rely on the expert-designed networks, which may lead to a sub-optimal …
Learning to branch for multi-task learning
Training multiple tasks jointly in one deep network yields reduced latency during inference
and better performance over the single-task counterpart by sharing certain layers of a …
HR-NAS: Searching efficient high-resolution neural architectures with lightweight transformers
High-resolution representations (HR) are essential for dense prediction tasks such as
segmentation, detection, and pose estimation. Learning HR representations is typically …
Towards fast adaptation of neural architectures with meta learning
Recently, Neural Architecture Search (NAS) has been successfully applied to multiple
artificial intelligence areas and shows better performance compared with hand-designed …
TransNAS-Bench-101: Improving transferability and generalizability of cross-task neural architecture search
Abstract Recent breakthroughs of Neural Architecture Search (NAS) extend the field's
research scope towards a broader range of vision tasks and more diversified search spaces …
An evaluation of Edge TPU accelerators for convolutional neural networks
Edge TPUs are a domain of accelerators for low-power edge devices and are widely used
in various Google products such as Coral and Pixel devices. In this paper, we first discuss …