Manifold learning: What, how, and why

M Meilă, H Zhang - Annual Review of Statistics and Its …, 2024 - annualreviews.org
Manifold learning (ML), also known as nonlinear dimension reduction, is a set of methods to
find the low-dimensional structure of data. Dimension reduction for large, high-dimensional …
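
The snippet only names the field, so as a purely illustrative aside, here is a minimal sketch of what such methods compute, using scikit-learn's manifold module on a synthetic swiss roll; the dataset, neighbor counts, and the choice of Isomap and LLE are assumptions for illustration, not taken from the review itself.

```python
# Minimal sketch (illustrative, not from the review): two classical manifold
# learners recovering the 2-D structure of a noisy 3-D swiss roll.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap, LocallyLinearEmbedding

X, color = make_swiss_roll(n_samples=1500, noise=0.05, random_state=0)  # 3-D ambient data

# Isomap: geodesic distances on a kNN graph, followed by classical MDS.
Z_iso = Isomap(n_neighbors=12, n_components=2).fit_transform(X)

# Locally Linear Embedding: preserve each point's local linear reconstruction weights.
Z_lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2,
                               random_state=0).fit_transform(X)

print(Z_iso.shape, Z_lle.shape)  # (1500, 2) (1500, 2)
```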

Understanding how dimension reduction tools work: an empirical approach to deciphering t-SNE, UMAP, TriMap, and PaCMAP for data visualization

Y Wang, H Huang, C Rudin, Y Shaposhnik - Journal of Machine Learning …, 2021 - jmlr.org
Dimension reduction (DR) techniques such as t-SNE, UMAP, and TriMap have
demonstrated impressive visualization performance on many real-world datasets. One …
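
As a hedged usage sketch of two of the tools named above (scikit-learn's t-SNE and the third-party umap-learn package; TriMap and PaCMAP ship their own packages with broadly similar fit_transform interfaces, which I do not reproduce here), on an arbitrary example dataset with illustrative hyperparameters:

```python
# Illustrative only: run t-SNE and UMAP on the same data for visual comparison.
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE
import umap  # pip install umap-learn

X, y = load_digits(return_X_y=True)

Z_tsne = TSNE(n_components=2, perplexity=30, init="pca",
              random_state=0).fit_transform(X)
Z_umap = umap.UMAP(n_neighbors=15, min_dist=0.1,
                   random_state=0).fit_transform(X)

print(Z_tsne.shape, Z_umap.shape)  # both (n_samples, 2)
```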

The art of using t-SNE for single-cell transcriptomics

D Kobak, P Berens - Nature communications, 2019 - nature.com
Single-cell transcriptomics yields ever growing data sets containing RNA expression levels
for thousands of genes from up to millions of cells. Common data analysis pipelines include …
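
A hedged sketch of the kind of protocol discussed for large single-cell data sets (PCA preprocessing and initialization, perplexity and learning rate scaled to the number of cells); the synthetic matrix and the specific parameter values below are illustrative assumptions, not the paper's recommendations verbatim.

```python
# Sketch of a typical t-SNE pipeline for a cells x genes matrix.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 1000))        # stand-in for a cells x genes matrix

X50 = PCA(n_components=50, random_state=0).fit_transform(X)  # standard preprocessing
n = X50.shape[0]

Z = TSNE(
    n_components=2,
    perplexity=30,          # often increased for very large n
    init="pca",             # PCA initialization helps preserve global structure
    learning_rate=n / 12,   # a commonly cited heuristic for large data sets
    early_exaggeration=12,
    random_state=0,
).fit_transform(X50)
```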

openTSNE: a modular Python library for t-SNE dimensionality reduction and embedding

PG Poličar, M Stražar, B Zupan - Journal of Statistical Software, 2024 - jstatsoft.org
One of the most popular techniques for visualizing large, high-dimensional data sets is
t-distributed stochastic neighbor embedding (t-SNE). Recently, several extensions have been …
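
A minimal sketch of the modular workflow openTSNE advertises: fit an embedding, then map new samples into it. The parameter names and the fit/transform pattern below reflect my understanding of the library and should be checked against the openTSNE documentation; the data are synthetic stand-ins.

```python
import numpy as np
from openTSNE import TSNE  # pip install opentsne

rng = np.random.default_rng(0)
X_train = rng.standard_normal((2000, 50))
X_new = rng.standard_normal((100, 50))

embedding = TSNE(
    perplexity=30,
    initialization="pca",
    metric="euclidean",
    n_jobs=4,
    random_state=0,
).fit(X_train)                      # returns an embedding object

Z_new = embedding.transform(X_new)  # embed new points into the existing map
print(np.asarray(embedding).shape, Z_new.shape)
```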

Assessing single-cell transcriptomic variability through density-preserving data visualization

A Narayan, B Berger, H Cho - Nature biotechnology, 2021 - nature.com
Nonlinear data visualization methods, such as t-distributed stochastic neighbor embedding
(t-SNE) and uniform manifold approximation and projection (UMAP), summarize the …
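
To my knowledge, the density-preserving idea behind this paper (densMAP) is exposed in umap-learn (version 0.5 and later) through a densmap flag; the flag name, the dens_lambda weight, and the synthetic two-cluster data below are assumptions to verify against the library documentation, offered only as a sketch.

```python
# Hedged sketch: density-preserving UMAP on data with one tight and one diffuse cluster.
import numpy as np
import umap  # pip install umap-learn

rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(0.0, 0.3, size=(500, 20)),   # a tight cluster
    rng.normal(5.0, 2.0, size=(500, 20)),   # a diffuse cluster
])

Z = umap.UMAP(densmap=True, dens_lambda=2.0, n_neighbors=30,
              random_state=0).fit_transform(X)
# With densmap=True, the diffuse cluster should occupy visibly more area
# in the embedding than the tight one, reflecting local density.
```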

[BOOK][B] Elements of dimensionality reduction and manifold learning

B Ghojogh, M Crowley, F Karray, A Ghodsi - 2023 - Springer
Dimensionality reduction, also known as manifold learning, is an area of machine learning
used for extracting informative features from data for better representation of data or …

Attraction-repulsion spectrum in neighbor embeddings

JN Böhm, P Berens, D Kobak - Journal of Machine Learning Research, 2022 - jmlr.org
Neighbor embeddings are a family of methods for visualizing complex high-dimensional
data sets using kNN graphs. To find the low-dimensional embedding, these algorithms …
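
To make the attraction-repulsion framing concrete, here is the standard t-SNE gradient written as an attractive term over kNN-graph edges, scaled by an exaggeration factor ρ, minus a repulsive term over all pairs; the notation is the generic textbook one, not copied from the paper.

```latex
% Illustrative decomposition; rho scales attraction relative to repulsion.
\[
q_{ij} = \frac{\left(1 + \lVert y_i - y_j \rVert^{2}\right)^{-1}}{Z},
\qquad
\frac{\partial \mathcal{L}}{\partial y_i}
\;\propto\;
\underbrace{\rho \sum_{j:\,p_{ij} > 0} p_{ij}\, q_{ij} Z \,(y_i - y_j)}_{\text{attraction along kNN edges}}
\;-\;
\underbrace{\sum_{j \neq i} q_{ij}^{2}\, Z \,(y_i - y_j)}_{\text{repulsion between all pairs}}.
\]
```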

SNEkhorn: Dimension reduction with symmetric entropic affinities

H Van Assel, T Vayer, R Flamary… - Advances in Neural …, 2023 - proceedings.neurips.cc
Many approaches in machine learning rely on a weighted graph to encode the similarities
between samples in a dataset. Entropic affinities (EAs), which are notably used in the …
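
For reference, entropic affinities in their usual form: each bandwidth σᵢ is tuned so the conditional similarity distribution around point i has a prescribed perplexity ξ. The notation below is the standard one (not quoted from the paper); as I read the title, SNEkhorn's contribution is a symmetric, Sinkhorn-style construction of such affinities rather than the usual ad hoc symmetrization.

```latex
\[
P_{j \mid i} = \frac{\exp\!\left(-\lVert x_i - x_j \rVert^{2} / 2\sigma_i^{2}\right)}
                    {\sum_{k \neq i} \exp\!\left(-\lVert x_i - x_k \rVert^{2} / 2\sigma_i^{2}\right)},
\qquad
H\!\left(P_{\cdot \mid i}\right) = \log \xi \quad \text{for every } i.
\]
```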

A probabilistic graph coupling view of dimension reduction

H Van Assel, T Espinasse… - Advances in Neural …, 2022 - proceedings.neurips.cc
Most popular dimension reduction (DR) methods like t-SNE and UMAP are based on
minimizing a cost between input and latent pairwise similarities. Though widely used, these …
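
The shared template these methods instantiate can be written generically as below (illustrative notation, not the paper's): input similarities P computed from the data X, latent similarities Q computed from the embedding Y, and a discrepancy D minimized over Y.

```latex
\[
\min_{Y \in \mathbb{R}^{n \times d}} \; \sum_{i \neq j} D\!\left(P_{ij}(X),\, Q_{ij}(Y)\right)
\]
% e.g. t-SNE uses a KL divergence between normalized similarities, while
% UMAP uses a binary cross-entropy between unnormalized edge probabilities.
```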

From t-SNE to UMAP with contrastive learning

S Damrich, JN Böhm, FA Hamprecht… - arXiv preprint arXiv …, 2022 - arxiv.org
Neighbor embedding methods t-SNE and UMAP are the de facto standard for visualizing
high-dimensional datasets. Motivated from entirely different viewpoints, their loss functions …
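
As a hedged sketch of the sampled, contrastive-style per-edge loss underlying UMAP-like optimizers (the generic negative-sampling form, not a formula quoted from the paper): for an observed kNN edge (i, j) and m randomly drawn "negative" points k₁, …, kₘ,

```latex
\[
\ell(i, j) = -\log q_{ij} \;-\; \sum_{s=1}^{m} \log\!\left(1 - q_{i k_s}\right),
\qquad
q_{ij} = \left(1 + a \lVert y_i - y_j \rVert^{2b}\right)^{-1}.
\]
```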