Simplified state space layers for sequence modeling
Models using structured state space sequence (S4) layers have achieved state-of-the-art
performance on long-range sequence modeling tasks. An S4 layer combines linear state …
Monarch Mixer: A simple sub-quadratic GEMM-based architecture
Abstract Machine learning models are increasingly being scaled in both sequence length
and model dimension to reach longer contexts and better performance. However, existing …
Mega: Moving average equipped gated attention
The design choices in the Transformer attention mechanism, including weak inductive bias
and quadratic computational complexity, have limited its application for modeling long …
More ConvNets in the 2020s: Scaling up kernels beyond 51x51 using sparsity
Transformers have quickly shined in the computer vision world since the emergence of
Vision Transformers (ViTs). The dominant role of convolutional neural networks (CNNs) …
Towards multi-spatiotemporal-scale generalized PDE modeling
Partial differential equations (PDEs) are central to describing complex physical system
simulations. Their expensive solution techniques have led to an increased interest in deep …
Convolutional networks with oriented 1D kernels
In computer vision, 2D convolution is arguably the most important operation performed by a
ConvNet. Unsurprisingly, it has been the focus of intense software and hardware …
Learning long sequences in spiking neural networks
Spiking neural networks (SNNs) take inspiration from the brain to enable energy-efficient
computations. Since the advent of Transformers, SNNs have struggled to compete with …
Transformers significantly improve splice site prediction
BA Jónsson, GH Halldórsson, S Árdal… - Communications …, 2024 - nature.com
Mutations that affect RNA splicing significantly impact human diversity and disease. Here we
present a method using transformers, a type of machine learning model, to detect splicing …
QuadConv: Quadrature-based convolutions with applications to non-uniform PDE data compression
We present a new convolution layer for deep learning architectures which we call
QuadConv—an approximation to continuous convolution via quadrature. Our operator is …
DNArch: Learning convolutional neural architectures by backpropagation
We present Differentiable Neural Architectures (DNArch), a method that jointly learns the
weights and the architecture of Convolutional Neural Networks (CNNs) by backpropagation …