Recent advances in directional statistics

A Pewsey, E García-Portugués - Test, 2021 - Springer
Mainstream statistical methodology is generally applicable to data observed in Euclidean
space. There are, however, numerous contexts of considerable scientific interest in which …

Unsupervised grouped axial data modeling via hierarchical Bayesian nonparametric models with Watson distributions

W Fan, L Yang, N Bouguila - IEEE Transactions on Pattern …, 2021 - ieeexplore.ieee.org
This paper aims at proposing an unsupervised hierarchical nonparametric Bayesian
framework for modeling axial data (i.e., observations are axes of direction) that can be …
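
For context, a minimal sketch of the Watson density commonly used for axial data, W(x; mu, kappa) ∝ exp(kappa (mu^T x)^2) under its standard parameterization; this is my own illustration, not the authors' hierarchical model or code.

# Hedged sketch: standard p-dimensional Watson density, not the paper's implementation.
import numpy as np
from scipy.special import gamma, hyp1f1

def watson_log_pdf(x, mu, kappa):
    """Log-density of the Watson distribution at unit vector x.

    x, mu : unit-norm arrays of shape (p,); kappa : real concentration parameter.
    """
    p = x.shape[0]
    # Normalizing constant involves Kummer's confluent hypergeometric function M(1/2, p/2, kappa).
    log_c = (np.log(gamma(p / 2.0)) - np.log(2.0)
             - (p / 2.0) * np.log(np.pi) - np.log(hyp1f1(0.5, p / 2.0, kappa)))
    return log_c + kappa * (mu @ x) ** 2

# Axial symmetry: x and -x receive the same density, which is what makes this suitable for axes.
mu = np.array([0.0, 0.0, 1.0])
x = np.array([0.6, 0.0, 0.8])
assert np.isclose(watson_log_pdf(x, mu, 5.0), watson_log_pdf(-x, mu, 5.0))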

Finite mixture modeling in time series: A survey of Bayesian filters and fusion approaches

T Li, H Liang, B Xiao, Q Pan, Y He - Information Fusion, 2023 - Elsevier
From the celebrated Gaussian mixture, model averaging estimators to the cutting-edge
multi-Bernoulli mixture of various forms, finite mixture models offer a fundamental and flexible …
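
As a reminder of the basic object this survey generalizes, here is a generic E-step for a 1-D finite Gaussian mixture; it is a minimal illustration of mixture responsibilities, not material from the survey, and the survey's multi-Bernoulli and filtering variants are far more general.

# Illustrative finite-mixture E-step only; not from the cited survey.
import numpy as np
from scipy.stats import norm

def gmm_responsibilities(x, weights, means, stds):
    """Component responsibilities and total log-likelihood for a 1-D Gaussian mixture."""
    x = np.asarray(x, dtype=float)[:, None]                     # shape (n, 1)
    log_joint = np.log(weights) + norm.logpdf(x, means, stds)   # shape (n, K)
    log_norm = np.logaddexp.reduce(log_joint, axis=1, keepdims=True)
    return np.exp(log_joint - log_norm), float(log_norm.sum())

resp, ll = gmm_responsibilities([0.1, 2.9, 3.1],
                                weights=[0.5, 0.5], means=[0.0, 3.0], stds=[1.0, 1.0])
print(np.round(resp, 3), round(ll, 3))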

Improving deep neural networks with multi-layer maxout networks and a novel initialization method

W Sun, F Su, L Wang - Neurocomputing, 2018 - Elsevier
For the purpose of enhancing the discriminability of convolutional neural networks (CNNs)
and facilitating the optimization, we investigate the activation function for a neural network …
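
A minimal NumPy sketch of the maxout activation the paper builds on: each output unit takes the maximum over k affine "pieces". The shapes and variable names below are illustrative assumptions, not the paper's architecture or code.

# Hedged sketch of a maxout unit; layer sizes are arbitrary examples.
import numpy as np

def maxout_layer(x, W, b):
    """Maxout: element-wise max over k affine pieces.

    x : (batch, d_in); W : (k, d_in, d_out); b : (k, d_out). Returns (batch, d_out).
    """
    # Compute all k affine maps at once -> shape (k, batch, d_out), then take the max over pieces.
    z = np.einsum('bi,kio->kbo', x, W) + b[:, None, :]
    return z.max(axis=0)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))          # batch of 4, input dimension 8
W = rng.normal(size=(3, 8, 16))      # k = 3 pieces, output dimension 16
b = rng.normal(size=(3, 16))
print(maxout_layer(x, W, b).shape)   # (4, 16)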

Variational Bayesian learning for Dirichlet process mixture of inverted Dirichlet distributions in non-Gaussian image feature modeling

Z Ma, Y Lai, WB Kleijn, YZ Song… - IEEE Transactions on …, 2018 - ieeexplore.ieee.org
In this paper, we develop a novel variational Bayesian learning method for the Dirichlet
process (DP) mixture of the inverted Dirichlet distributions, which has been shown to be very …
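
A hedged sketch, under standard definitions, of the two ingredients named in the title: truncated stick-breaking weights for a Dirichlet process mixture and the inverted Dirichlet log-density for positive feature vectors. The variational updates of the paper are not reproduced here.

# Standard-textbook forms only; not the authors' variational learning code.
import numpy as np
from scipy.special import gammaln

def stick_breaking_weights(v):
    """Truncated stick-breaking: weights pi_t = v_t * prod_{s<t} (1 - v_s)."""
    v = np.asarray(v, dtype=float)
    cum = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return v * cum

def inverted_dirichlet_logpdf(x, alpha):
    """Log-density of the inverted Dirichlet for x in R_+^D, parameters alpha of length D+1."""
    x = np.asarray(x, dtype=float)
    alpha = np.asarray(alpha, dtype=float)
    a_sum = alpha.sum()
    return (gammaln(a_sum) - gammaln(alpha).sum()
            + np.sum((alpha[:-1] - 1.0) * np.log(x))
            - a_sum * np.log1p(x.sum()))

pi = stick_breaking_weights([0.5, 0.5, 0.5])    # truncation level 3
print(pi, pi.sum())                              # remaining mass belongs to the truncated tail
print(inverted_dirichlet_logpdf([0.2, 1.5], [2.0, 3.0, 4.0]))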

Decorrelation of neutral vector variables: Theory and applications

Z Ma, JH Xue, A Leijon, ZH Tan… - IEEE Transactions on …, 2016 - ieeexplore.ieee.org
In this paper, we propose novel strategies for neutral vector variable decorrelation. Two
fundamental invertible transformations, namely, serial nonlinear transformation and parallel …
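
A hedged sketch of one serial nonlinear transformation for a neutral (e.g., Dirichlet-distributed) vector, u_k = x_k / (1 - sum_{i<k} x_i); for Dirichlet input these ratios are independent Beta variables, which is the decorrelation idea, though the paper's exact serial and parallel constructions may differ in detail.

# Assumed form of a serial nonlinear transform; empirical decorrelation check included.
import numpy as np

def serial_nonlinear_transform(x):
    """Map the first D-1 coordinates of a simplex point to ratio variables u_k."""
    x = np.asarray(x, dtype=float)
    remaining = 1.0 - np.concatenate(([0.0], np.cumsum(x)[:-1]))  # 1 minus the sum of earlier coords
    return x[:-1] / remaining[:-1]

rng = np.random.default_rng(0)
samples = rng.dirichlet([2.0, 3.0, 4.0, 5.0], size=20000)
u = np.apply_along_axis(serial_nonlinear_transform, 1, samples)
print(np.round(np.corrcoef(samples.T), 2))   # raw coordinates are negatively correlated
print(np.round(np.corrcoef(u.T), 2))         # transformed variables are close to uncorrelated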

Deep clustering analysis via dual variational autoencoder with spherical latent embeddings

L Yang, W Fan, N Bouguila - IEEE Transactions on Neural …, 2021 - ieeexplore.ieee.org
In recent years, clustering methods based on deep generative models have received great
attention in various unsupervised applications, due to their capabilities for learning …

Insights into multiple/single lower bound approximation for extended variational inference in non-Gaussian structured data modeling

Z Ma, J Xie, Y Lai, J Taghia, JH Xue… - IEEE Transactions on …, 2019 - ieeexplore.ieee.org
For most of the non-Gaussian statistical models, the data being modeled represent strongly
structured properties, such as scalar data with bounded support (e.g., beta distribution) …

DINO as a von Mises-Fisher mixture model

H Govindarajan, P Sidén, J Roll, F Lindsten - arXiv preprint arXiv …, 2024 - arxiv.org
Self-distillation methods using Siamese networks are popular for self-supervised pre-training.
DINO is one such method based on a cross-entropy loss between $K …
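
For reference, a minimal sketch of the von Mises-Fisher log-density that the paper uses to reinterpret DINO's prototypes, written in its standard parameterization; this is not the DINO-vMF implementation, and the symbols below are my own.

# Hedged sketch of the vMF log-density on the unit sphere; not the paper's code.
import numpy as np
from scipy.special import ive   # exponentially scaled modified Bessel function of the first kind

def vmf_log_pdf(x, mu, kappa):
    """Log-density of a von Mises-Fisher distribution on S^{p-1}.

    x, mu : unit-norm arrays of shape (p,); kappa : concentration > 0.
    """
    p = x.shape[0]
    nu = p / 2.0 - 1.0
    # log C_p(kappa), with log I_nu(kappa) = log(ive(nu, kappa)) + kappa for numerical stability.
    log_c = (nu * np.log(kappa) - (p / 2.0) * np.log(2.0 * np.pi)
             - (np.log(ive(nu, kappa)) + kappa))
    return log_c + kappa * (mu @ x)

mu = np.array([0.0, 1.0, 0.0])
x = np.array([0.6, 0.8, 0.0])
print(vmf_log_pdf(x, mu, 10.0))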

Unsupervised meta-learning via spherical latent representations and dual VAE-GAN

W Fan, H Huang, C Liang, X Liu, SJ Peng - Applied Intelligence, 2023 - Springer
Unsupervised learning and meta-learning share a common goal of enhancing learning
efficiency compared to starting from scratch. However, meta-learning methods are …