Transformer neural processes: Uncertainty-aware meta learning via sequence modeling

T Nguyen, A Grover - arXiv preprint arXiv:2207.04179, 2022 - arxiv.org
Neural Processes (NPs) are a popular class of approaches for meta-learning. Similar to
Gaussian Processes (GPs), NPs define distributions over functions and can estimate …

NP-Match: When neural processes meet semi-supervised learning

J Wang, T Lukasiewicz, D Massiceti… - International …, 2022 - proceedings.mlr.press
Semi-supervised learning (SSL) has been widely explored in recent years, and it is an
effective way of leveraging unlabeled data to reduce the reliance on labeled data. In this …

The neural process family: Survey, applications and perspectives

S Jha, D Gong, X Wang, RE Turner, L Yao - arXiv preprint arXiv …, 2022 - arxiv.org
The standard approaches to neural network implementation yield powerful function
approximation capabilities but are limited in their abilities to learn meta representations and …

Inter-domain mixup for semi-supervised domain adaptation

J Li, G Li, Y Yu - Pattern Recognition, 2024 - Elsevier
Semi-supervised domain adaptation (SSDA) aims to bridge source and target domain
distributions, with a small number of target labels available, achieving better classification …

Challenges in data-driven geospatial modeling for environmental research and practice

D Koldasbayeva, P Tregubova, M Gasanov… - Nature …, 2024 - nature.com
Machine learning-based geospatial applications offer unique opportunities for
environmental monitoring due to domains and scales adaptability and computational …

Affective processes: stochastic modelling of temporal context for emotion and facial expression recognition

E Sanchez, MK Tellamekala… - Proceedings of the …, 2021 - openaccess.thecvf.com
Temporal context is key to the recognition of expressions of emotion. Existing methods, which
rely on recurrent or self-attention models to enforce temporal consistency, work on the …

Latent bottlenecked attentive neural processes

L Feng, H Hajimirsadeghi, Y Bengio… - arXiv preprint arXiv …, 2022 - arxiv.org
Neural Processes (NPs) are popular methods in meta-learning that can estimate predictive
uncertainty on target datapoints by conditioning on a context dataset. Previous state-of-the …

Bridge the inference gaps of neural processes via expectation maximization

Q Wang, M Federici, H van Hoof - arXiv preprint arXiv:2501.03264, 2025 - arxiv.org
The neural process (NP) is a family of computationally efficient models for learning
distributions over functions. However, it suffers from under-fitting and shows suboptimal …

Learning intrinsic and extrinsic intentions for cold-start recommendation with neural stochastic processes

H Liu, L Jing, D Yu, M Zhou, M Ng - Proceedings of the 30th ACM …, 2022 - dl.acm.org
User behavior data in recommendation are driven by the complex interactions of many
intentions behind the user's decision making process. However, user behavior data tends to …

Neural processes with stochastic attention: Paying more attention to the context dataset

M Kim, K Go, SY Yun - arXiv preprint arXiv:2204.05449, 2022 - arxiv.org
Neural processes (NPs) aim to stochastically complete unseen data points based on a given
context dataset. NPs essentially leverage a given dataset as a context representation to …