Deep time series models: A comprehensive survey and benchmark
Time series, characterized by a sequence of data points arranged in a discrete-time order,
are ubiquitous in real-world applications. Different from other modalities, time series present …
Data-driven stock forecasting models based on neural networks: A review
As a core branch of financial forecasting, stock forecasting plays a crucial role for financial
analysts, investors, and policymakers in managing risks and optimizing investment …
Mamba: Linear-time sequence modeling with selective state spaces
Foundation models, now powering most of the exciting applications in deep learning, are
almost universally based on the Transformer architecture and its core attention module …
VMamba: Visual state space model
Designing computationally efficient network architectures remains an ongoing necessity in
computer vision. In this paper, we adapt Mamba, a state-space language model, into …
U-Mamba: Enhancing long-range dependency for biomedical image segmentation
Convolutional Neural Networks (CNNs) and Transformers have been the most popular
architectures for biomedical image segmentation, but both of them have limited ability to …
Transformers are SSMs: Generalized models and efficient algorithms through structured state space duality
While Transformers have been the main architecture behind deep learning's success in
language modeling, state-space models (SSMs) such as Mamba have recently been shown …
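Several of the entries above build on the same discrete linear state-space recurrence, x_t = A x_{t-1} + B u_t with readout y_t = C x_t. A minimal sketch of that recurrence (illustrative function and variable names, not any specific paper's implementation):

```python
import numpy as np

def ssm_scan(A, B, C, u):
    """Run a discrete linear SSM over an input sequence u.

    State update: x_t = A x_{t-1} + B u_t
    Readout:      y_t = C x_t
    """
    x = np.zeros(A.shape[0])   # initial state x_0 = 0
    ys = []
    for u_t in u:
        x = A @ x + B * u_t    # state update
        ys.append(C @ x)       # readout
    return np.array(ys)
```

Selective SSMs such as Mamba make A, B, and C functions of the input at each step, but the underlying sequential scan has this same shape.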
Resurrecting recurrent neural networks for long sequences
Recurrent Neural Networks (RNNs) offer fast inference on long sequences but are
hard to optimize and slow to train. Deep state-space models (SSMs) have recently been …
MambaIR: A simple baseline for image restoration with state-space model
Recent years have seen significant advancements in image restoration, largely attributed to
the development of modern deep neural networks, such as CNNs and Transformers …
Hungry Hungry Hippos: Towards language modeling with state space models
State space models (SSMs) have demonstrated state-of-the-art sequence modeling
performance in some modalities, but underperform attention in language modeling …
PointMamba: A simple state space model for point cloud analysis
Transformers have become one of the foundational architectures in point cloud analysis
tasks due to their excellent global modeling ability. However, the attention mechanism has …