A brief review of hypernetworks in deep learning

VK Chauhan, J Zhou, P Lu, S Molaei… - Artificial Intelligence …, 2024 - Springer
Hypernetworks, or hypernets for short, are neural networks that generate weights for another
neural network, known as the target network. They have emerged as a powerful deep …
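The core idea in this snippet, one network emitting the weights of another, can be sketched minimally. The toy NumPy example below (names, dimensions, and architecture are invented for illustration, not taken from the survey) uses a small MLP to map a task embedding to the flattened parameters of a target linear layer:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target network: a single linear layer y = x @ W + b with
# W of shape (4, 3) and b of shape (3,) -> 15 parameters total.
in_dim, out_dim = 4, 3
n_target_params = in_dim * out_dim + out_dim

# Hypernetwork: a tiny MLP mapping a task embedding z to the
# target network's full parameter vector.
z_dim, hidden = 2, 8
H1 = rng.normal(scale=0.1, size=(z_dim, hidden))
H2 = rng.normal(scale=0.1, size=(hidden, n_target_params))

def hypernet(z):
    """Generate the target network's parameters from embedding z."""
    h = np.tanh(z @ H1)
    return h @ H2  # flat parameter vector of length 15

def target_forward(x, params):
    """Run the target network with externally supplied weights."""
    W = params[: in_dim * out_dim].reshape(in_dim, out_dim)
    b = params[in_dim * out_dim :]
    return x @ W + b

z = rng.normal(size=(z_dim,))
params = hypernet(z)
y = target_forward(rng.normal(size=(5, in_dim)), params)
print(y.shape)  # (5, 3)
```

In practice only the hypernetwork's parameters (here `H1`, `H2`) are trained; the target network's weights are always produced on the fly, which is what lets one hypernetwork serve many tasks or conditions.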

A review of uncertainty quantification in deep learning: Techniques, applications and challenges

M Abdar, F Pourpanah, S Hussain, D Rezazadegan… - Information Fusion, 2021 - Elsevier
Uncertainty quantification (UQ) methods play a pivotal role in reducing the impact of
uncertainties during both optimization and decision making processes. They have been …

A survey of uncertainty in deep neural networks

J Gawlikowski, CRN Tassi, M Ali, J Lee, M Humt… - Artificial Intelligence …, 2023 - Springer
Over the last decade, neural networks have reached almost every field of science and
become a crucial part of various real world applications. Due to the increasing spread …

Uncertainty quantification in scientific machine learning: Methods, metrics, and comparisons

AF Psaros, X Meng, Z Zou, L Guo… - Journal of Computational …, 2023 - Elsevier
Neural networks (NNs) are currently changing the computational paradigm on how to
combine data with mathematical laws in physics and engineering in a profound way …

Neural architecture search: Insights from 1000 papers

C White, M Safari, R Sukthanker, B Ru, T Elsken… - arXiv preprint arXiv …, 2023 - arxiv.org
In the past decade, advances in deep learning have resulted in breakthroughs in a variety of
areas, including computer vision, natural language understanding, speech recognition, and …

Editing factual knowledge in language models

N De Cao, W Aziz, I Titov - arXiv preprint arXiv:2104.08164, 2021 - arxiv.org
The factual knowledge acquired during pre-training and stored in the parameters of
Language Models (LMs) can be useful in downstream tasks (e.g., question answering or …

Superhypergraph neural networks and plithogenic graph neural networks: Theoretical foundations

T Fujita - arXiv preprint arXiv:2412.01176, 2024 - arxiv.org
Hypergraphs extend traditional graphs by allowing edges to connect multiple nodes, while
superhypergraphs further generalize this concept to represent even more complex …
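The distinction the snippet draws, edges that may connect any number of nodes rather than exactly two, can be shown with a minimal hypergraph representation. This is a generic sketch with invented names, not the paper's superhypergraph or plithogenic construction:

```python
# A hypergraph stored as an incidence structure: each hyperedge is a
# frozenset of nodes, so a single edge may connect any number of nodes
# (an ordinary graph is the special case where every edge has size 2).
nodes = {1, 2, 3, 4}
hyperedges = [frozenset({1, 2, 3}), frozenset({3, 4})]

def incident_edges(node):
    """Return all hyperedges that contain the given node."""
    return [e for e in hyperedges if node in e]

print(len(incident_edges(3)))  # 2: node 3 lies in both hyperedges
print(len(incident_edges(1)))  # 1
```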

Meta-learning with latent embedding optimization

AA Rusu, D Rao, J Sygnowski, O Vinyals… - arXiv preprint arXiv …, 2018 - arxiv.org
Gradient-based meta-learning techniques are both widely applicable and proficient at
solving challenging few-shot learning and fast adaptation problems. However, they have …

Permutation equivariant neural functionals

A Zhou, K Yang, K Burns, A Cardace… - Advances in neural …, 2023 - proceedings.neurips.cc
This work studies the design of neural networks that can process the weights or gradients of
other neural networks, which we refer to as neural functional networks (NFNs). Despite a …
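The key symmetry such networks must respect can be shown concretely: permuting a hidden layer's units reorders the rows of its incoming weight matrix and the columns of its outgoing one without changing the function the network computes. The toy NumPy sketch below builds a feature map over weights that is invariant to this permutation; it is an invented illustration of the symmetry, not the NFN architecture from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Weights of a 2-layer MLP with 5 hidden units: permuting the hidden
# units permutes the rows of W1 and the columns of W2, leaving the
# computed function unchanged.
W1 = rng.normal(size=(5, 3))  # hidden x input
W2 = rng.normal(size=(2, 5))  # output x hidden

def invariant_features(W1, W2):
    """A per-hidden-unit summary that is invariant to permuting
    the hidden units: sum over the non-hidden axis, then sort."""
    h = np.concatenate([W1.sum(axis=1), W2.sum(axis=0)])
    h.sort()  # sorting removes any dependence on unit ordering
    return h

perm = rng.permutation(5)
f1 = invariant_features(W1, W2)
f2 = invariant_features(W1[perm], W2[:, perm])  # same net, relabeled units
print(np.allclose(f1, f2))  # True
```

A network whose layers commute with such permutations (equivariant) or ignore them (invariant) can process raw weight matrices without being confused by the arbitrary ordering of neurons.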

Opportunities and obstacles for deep learning in biology and medicine

T Ching, DS Himmelstein… - Journal of the …, 2018 - royalsocietypublishing.org
Deep learning describes a class of machine learning algorithms that are capable of
combining raw inputs into layers of intermediate features. These algorithms have recently …