Transformer neural processes: Uncertainty-aware meta learning via sequence modeling
Neural Processes (NPs) are a popular class of approaches for meta-learning. Similar to
Gaussian Processes (GPs), NPs define distributions over functions and can estimate …
NP-Match: When neural processes meet semi-supervised learning
Semi-supervised learning (SSL) has been widely explored in recent years, and it is an
effective way of leveraging unlabeled data to reduce the reliance on labeled data. In this …
The neural process family: Survey, applications and perspectives
The standard approaches to neural network implementation yield powerful function
approximation capabilities but are limited in their abilities to learn meta representations and …
Inter-domain mixup for semi-supervised domain adaptation
Semi-supervised domain adaptation (SSDA) aims to bridge source and target domain
distributions, with a small number of target labels available, achieving better classification …
Challenges in data-driven geospatial modeling for environmental research and practice
Machine learning-based geospatial applications offer unique opportunities for
environmental monitoring due to domains and scales adaptability and computational …
Affective processes: stochastic modelling of temporal context for emotion and facial expression recognition
Temporal context is key to the recognition of expressions of emotion. Existing methods that
rely on recurrent or self-attention models to enforce temporal consistency work on the …
Latent bottlenecked attentive neural processes
Neural Processes (NPs) are popular methods in meta-learning that can estimate predictive
uncertainty on target datapoints by conditioning on a context dataset. Previous state-of-the …
Bridge the inference gaps of neural processes via expectation maximization
The neural process (NP) is a family of computationally efficient models for learning
distributions over functions. However, it suffers from under-fitting and shows suboptimal …
Learning intrinsic and extrinsic intentions for cold-start recommendation with neural stochastic processes
User behavior data in recommendation are driven by the complex interactions of many
intentions behind the user's decision making process. However, user behavior data tends to …
Neural processes with stochastic attention: Paying more attention to the context dataset
Neural processes (NPs) aim to stochastically complete unseen data points based on a given
context dataset. NPs essentially leverage a given dataset as a context representation to …