Bayesian statistics and modelling

R Van de Schoot, S Depaoli, R King… - Nature Reviews …, 2021 - nature.com
Bayesian statistics is an approach to data analysis based on Bayes' theorem, where
available knowledge about parameters in a statistical model is updated with the information …

Hands-on Bayesian neural networks—A tutorial for deep learning users

LV Jospin, H Laga, F Boussaid… - IEEE Computational …, 2022 - ieeexplore.ieee.org
Modern deep learning methods constitute incredibly powerful tools to tackle a myriad of
challenging problems. However, since deep learning methods operate as black boxes, the …

Graph neural networks: foundation, frontiers and applications

L Wu, P Cui, J Pei, L Zhao, X Guo - … of the 28th ACM SIGKDD conference …, 2022 - dl.acm.org
The field of graph neural networks (GNNs) has seen rapid and incredible strides over the
recent years. Graph neural networks, also known as deep learning on graphs, graph …

Uncertainty quantification in scientific machine learning: Methods, metrics, and comparisons

AF Psaros, X Meng, Z Zou, L Guo… - Journal of Computational …, 2023 - Elsevier
Neural networks (NNs) are currently changing the computational paradigm on how to
combine data with mathematical laws in physics and engineering in a profound way …

A Python library for probabilistic analysis of single-cell omics data

A Gayoso, R Lopez, G Xing, P Boyeau… - Nature …, 2022 - nature.com
To the Editor—Methods for analyzing single-cell data 1–4 perform a core set of
computational tasks. These tasks include dimensionality reduction, cell clustering, cell-state …

What are Bayesian neural network posteriors really like?

P Izmailov, S Vikram, MD Hoffman… - … on machine learning, 2021 - proceedings.mlr.press
The posterior over Bayesian neural network (BNN) parameters is extremely high-
dimensional and non-convex. For computational reasons, researchers approximate this …

Graph neural networks for natural language processing: A survey

L Wu, Y Chen, K Shen, X Guo, H Gao… - … and Trends® in …, 2023 - nowpublishers.com
Deep learning has become the dominant approach in addressing various tasks in Natural
Language Processing (NLP). Although text inputs are typically represented as a sequence …

Learning substructure invariance for out-of-distribution molecular representations

N Yang, K Zeng, Q Wu, X Jia… - Advances in Neural …, 2022 - proceedings.neurips.cc
Molecule representation learning (MRL) has been extensively studied and current methods
have shown promising power for various tasks, e.g., molecular property prediction and target …

A variational perspective on solving inverse problems with diffusion models

M Mardani, J Song, J Kautz, A Vahdat - arXiv preprint arXiv:2305.04391, 2023 - arxiv.org
Diffusion models have emerged as a key pillar of foundation models in visual domains. One
of their critical applications is to universally solve different downstream inverse tasks via a …

From word models to world models: Translating from natural language to the probabilistic language of thought

L Wong, G Grand, AK Lew, ND Goodman… - arXiv preprint arXiv …, 2023 - arxiv.org
How does language inform our downstream thinking? In particular, how do humans make
meaning from language--and how can we leverage a theory of linguistic meaning to build …