FuseMoE: Mixture-of-Experts Transformers for Fleximodal Fusion

X Han, H Nguyen, C Harris, N Ho, S Saria - arXiv preprint arXiv …, 2024 - arxiv.org
As machine learning models in critical fields increasingly grapple with multimodal data, they
face the dual challenges of handling a wide array of modalities, often incomplete due to …

Is Temperature Sample Efficient for Softmax Gaussian Mixture of Experts?

H Nguyen, P Akbarian, N Ho - arXiv preprint arXiv:2401.13875, 2024 - arxiv.org
Dense-to-sparse gating mixture of experts (MoE) has recently become an effective
alternative to the well-known sparse MoE. Rather than fixing the number of activated experts …

Bayesian likelihood free inference using mixtures of experts

HD Nguyen, TT Nguyen… - 2024 International Joint …, 2024 - ieeexplore.ieee.org
We extend Bayesian Synthetic Likelihood (BSL) methods to non-Gaussian approximations
of the likelihood function. In this setting, we introduce Mixtures of Experts (MoEs), a class of …

Bayesian nonparametric mixture of experts for inverse problems

TT Nguyen, F Forbes, J Arbel… - Journal of …, 2024 - Taylor & Francis
Large classes of problems can be formulated as inverse problems, where the goal is to find
parameter values that best explain some observed measures. The relationship between …

Bayesian nonparametric mixture of experts for high-dimensional inverse problems

T Nguyen, F Forbes, J Arbel, HD Nguyen - 2023 - hal.science
Large classes of problems can be formulated as inverse problems, where the goal is to find
parameter values that best explain some observed measures. The relationship between …