A survey of contextual optimization methods for decision-making under uncertainty
Recently there has been a surge of interest in operations research (OR) and the machine
learning (ML) community in combining prediction algorithms and optimization techniques to …
The elements of differentiable programming
Artificial intelligence has recently experienced remarkable advances, fueled by large
models, vast datasets, accelerated hardware, and, last but not least, the transformative …
On neural differential equations
P Kidger - arXiv preprint arXiv:2202.02435, 2022 - arxiv.org
The conjoining of dynamical systems and deep learning has become a topic of great
interest. In particular, neural differential equations (NDEs) demonstrate that neural networks …
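As a rough sketch of the core idea (illustrative only, not code from this work): a neural differential equation treats the hidden state as the solution of dy/dt = f_theta(t, y), where f_theta is a neural network, and training backpropagates through a differentiable ODE solver. The helper names below (ODEFunc, odeint_euler) are hypothetical.

import torch
import torch.nn as nn

class ODEFunc(nn.Module):
    # Learned vector field f_theta(t, y); here a small MLP that ignores t.
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim))

    def forward(self, t, y):
        return self.net(y)

def odeint_euler(func, y0, t0=0.0, t1=1.0, steps=20):
    # Fixed-step explicit Euler integration; differentiable end to end
    # because it is composed of ordinary PyTorch operations.
    dt = (t1 - t0) / steps
    y, t = y0, t0
    for _ in range(steps):
        y = y + dt * func(t, y)
        t = t + dt
    return y

# Example: map a batch of 8 states of dimension 4 through the learned dynamics.
# func = ODEFunc(dim=4); y1 = odeint_euler(func, torch.randn(8, 4))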
CRAFT: Concept recursive activation factorization for explainability
Attribution methods are a popular class of explainability methods that use heatmaps to
depict the most important areas of an image that drive a model decision. Nevertheless …
Theseus: A library for differentiable nonlinear optimization
We present Theseus, an efficient application-agnostic open source library for differentiable
nonlinear least squares (DNLS) optimization built on PyTorch, providing a common …
A graph-based methodology for constructing computational models that automates adjoint-based sensitivity analysis
The adjoint method provides an efficient way to compute sensitivities for system models with
a large number of inputs. However, implementing the adjoint method requires significant …
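For context, the standard adjoint recipe (a textbook summary, not this paper's graph-based automation): given a state equation R(x, u) = 0 with inputs x and state u, and an objective F(x, u), one solves a single adjoint system and then assembles the total derivative,

\left(\frac{\partial R}{\partial u}\right)^{\top}\psi = \left(\frac{\partial F}{\partial u}\right)^{\top},
\qquad
\frac{dF}{dx} = \frac{\partial F}{\partial x} - \psi^{\top}\frac{\partial R}{\partial x}.

The cost is one linear solve regardless of the dimension of x, which is why the method is efficient for models with many inputs.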
Synergies between disentanglement and sparsity: Generalization and identifiability in multi-task learning
Although disentangled representations are often said to be beneficial for downstream tasks,
current empirical and theoretical understanding is limited. In this work, we provide evidence …
Linear adversarial concept erasure
Modern neural models trained on textual data rely on pre-trained representations that
emerge without direct supervision. As these representations are increasingly being used in …
A-NeSI: A scalable approximate method for probabilistic neurosymbolic inference
We study the problem of combining neural networks with symbolic reasoning. Recently
introduced frameworks for Probabilistic Neurosymbolic Learning (PNL), such as …
Sinkformers: Transformers with doubly stochastic attention
Attention based models such as Transformers involve pairwise interactions between data
points, modeled with a learnable attention matrix. Importantly, this attention matrix is …
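As an illustrative sketch (not the authors' reference implementation): a doubly stochastic attention matrix has every row and every column summing to one, which can be approximated by replacing the usual row-wise softmax with a few Sinkhorn iterations that alternately normalize rows and columns of the attention logits in log-space. The function name sinkhorn_attention is hypothetical.

import torch

def sinkhorn_attention(scores, n_iters=5):
    # scores: (..., n, n) attention logits, e.g. q @ k.transpose(-2, -1) / d ** 0.5
    log_p = scores
    for _ in range(n_iters):
        log_p = log_p - torch.logsumexp(log_p, dim=-1, keepdim=True)  # normalize rows
        log_p = log_p - torch.logsumexp(log_p, dim=-2, keepdim=True)  # normalize columns
    return log_p.exp()  # approximately doubly stochastic for square score matrices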