Deep time series models: A comprehensive survey and benchmark

Y Wang, H Wu, J Dong, Y Liu, M Long… - arXiv preprint arXiv …, 2024 - arxiv.org
Time series, characterized by a sequence of data points arranged in a discrete-time order,
are ubiquitous in real-world applications. Different from other modalities, time series present …

[HTML] Data-driven stock forecasting models based on neural networks: A review

W Bao, Y Cao, Y Yang, H Che, J Huang, S Wen - Information Fusion, 2024 - Elsevier
As a core branch of financial forecasting, stock forecasting plays a crucial role for financial
analysts, investors, and policymakers in managing risks and optimizing investment …

[PDF] Mamba: Linear-time sequence modeling with selective state spaces

A Gu, T Dao - arXiv preprint arXiv:2312.00752, 2023 - minjiazhang.github.io
Foundation models, now powering most of the exciting applications in deep learning, are
almost universally based on the Transformer architecture and its core attention module …

VMamba: Visual state space model

Y Liu, Y Tian, Y Zhao, H Yu, L Xie… - Advances in neural …, 2025 - proceedings.neurips.cc
Designing computationally efficient network architectures remains an ongoing necessity in
computer vision. In this paper, we adapt Mamba, a state-space language model, into …

U-Mamba: Enhancing long-range dependency for biomedical image segmentation

J Ma, F Li, B Wang - arXiv preprint arXiv:2401.04722, 2024 - arxiv.org
Convolutional Neural Networks (CNNs) and Transformers have been the most popular
architectures for biomedical image segmentation, but both of them have limited ability to …

Transformers are SSMs: Generalized models and efficient algorithms through structured state space duality

T Dao, A Gu - arXiv preprint arXiv:2405.21060, 2024 - arxiv.org
While Transformers have been the main architecture behind deep learning's success in
language modeling, state-space models (SSMs) such as Mamba have recently been shown …

Resurrecting recurrent neural networks for long sequences

A Orvieto, SL Smith, A Gu, A Fernando… - International …, 2023 - proceedings.mlr.press
Recurrent Neural Networks (RNNs) offer fast inference on long sequences but are
hard to optimize and slow to train. Deep state-space models (SSMs) have recently been …

MambaIR: A simple baseline for image restoration with state-space model

H Guo, J Li, T Dai, Z Ouyang, X Ren, ST Xia - European conference on …, 2024 - Springer
Recent years have seen significant advancements in image restoration, largely attributed to
the development of modern deep neural networks, such as CNNs and Transformers …

Hungry Hungry Hippos: Towards language modeling with state space models

DY Fu, T Dao, KK Saab, AW Thomas, A Rudra… - arXiv preprint arXiv …, 2022 - arxiv.org
State space models (SSMs) have demonstrated state-of-the-art sequence modeling
performance in some modalities, but underperform attention in language modeling …

PointMamba: A simple state space model for point cloud analysis

D Liang, X Zhou, W Xu, X Zhu, Z Zou… - Advances in neural …, 2025 - proceedings.neurips.cc
Transformers have become one of the foundational architectures in point cloud analysis
tasks due to their excellent global modeling ability. However, the attention mechanism has …