(De/Re)-Composition of Data-Parallel Computations via Multi-Dimensional Homomorphisms
A Rasch - ACM Transactions on Programming Languages and …, 2024 - dl.acm.org
Data-parallel computations, such as linear algebra routines and stencil computations,
constitute one of the most relevant classes in parallel computing, e.g., due to their importance …
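To make the snippet concrete, below is a minimal C++ sketch of the general (de/re)-composition idea: because the combine operator is associative, a reduction over a multi-dimensional array can be de-composed into independent partial reductions and re-composed with the same operator. The array shape and the plain integer sum are illustrative assumptions, not the paper's MDH formalism or implementation.

```cpp
#include <cstdio>
#include <vector>

// Minimal sketch of the (de/re)-composition idea (illustrative example,
// not the MDH formalism): a sum over a 2D array is de-composed into
// per-row partial sums, which are then re-composed with the same
// associative combine operator (+).
int main() {
    const int rows = 4, cols = 8;
    std::vector<std::vector<int>> a(rows, std::vector<int>(cols, 1));

    // De-composition: each row's partial result can be computed
    // independently (e.g., on different cores or devices).
    std::vector<int> partial(rows, 0);
    for (int i = 0; i < rows; ++i)
        for (int j = 0; j < cols; ++j)
            partial[i] += a[i][j];

    // Re-composition: combine partial results with the same operator.
    int total = 0;
    for (int i = 0; i < rows; ++i)
        total += partial[i];

    std::printf("total = %d\n", total);  // prints 32
    return 0;
}
```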
Full Version: (De/Re)-Composition of Data-Parallel Computations via Multi-Dimensional Homomorphisms
A Rasch - arXiv preprint arXiv:2405.05118, 2024 - arxiv.org
We formally introduce a systematic (de/re)-composition approach, based on the algebraic
formalism of "Multi-Dimensional Homomorphisms (MDHs)". Our approach is designed as …
mlirSynth: Automatic, Retargetable Program Raising in Multi-Level IR using Program Synthesis
MLIR is an emerging compiler infrastructure for modern hardware, but existing programs
cannot take advantage of MLIR's high-performance compilation if they are described in …
Compiling Recurrences over Dense and Sparse Arrays
We present a framework for compiling recurrence equations into native code. In our
framework, users specify a system of recurrences, the types of data structures that store …
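As a rough illustration of what compiling a recurrence can mean, the sketch below lowers a simple recurrence (Fibonacci over a dense array, chosen only as an example) into a plain loop; the paper's framework targets user-specified recurrence systems over both dense and sparse storage, which this toy case does not attempt to cover.

```cpp
#include <cstdio>
#include <vector>

// Illustrative lowering of a recurrence to native code (not the paper's
// framework): f(0) = 0, f(1) = 1, f(n) = f(n-1) + f(n-2), materialized
// over a dense array and evaluated by a simple loop.
int main() {
    const int n = 10;
    std::vector<long long> f(n + 1);
    f[0] = 0;
    f[1] = 1;
    for (int i = 2; i <= n; ++i)
        f[i] = f[i - 1] + f[i - 2];  // each step reuses stored results
    std::printf("f(%d) = %lld\n", n, f[n]);  // prints f(10) = 55
    return 0;
}
```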
(De/Re)-Compositions Expressed Systematically via MDH-Based Schedules
We introduce a new scheduling language, based on the formalism of Multi-Dimensional
Homomorphisms (MDH). In contrast to existing scheduling languages, our MDH-based …
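For readers unfamiliar with scheduling languages, the sketch below shows the kind of (de/re)-composition decision a schedule typically expresses, here simple tiling of a 1-D reduction. It is a generic illustration under an assumed tile size, not the MDH-based scheduling language itself.

```cpp
#include <cstdio>

// Generic illustration of a scheduling decision (not the MDH scheduling
// language): the same reduction over 0..N-1, de-composed into tiles of
// size T and re-composed tile by tile.
int main() {
    const int N = 16, T = 4;
    int sum = 0;
    for (int tile = 0; tile < N; tile += T)     // outer loop: over tiles
        for (int i = tile; i < tile + T; ++i)   // inner loop: within a tile
            sum += i;
    std::printf("sum = %d\n", sum);  // prints 120
    return 0;
}
```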
Incremental Computation: What Is the Essence?
YA Liu - arXiv preprint arXiv:2312.07946, 2023 - arxiv.org
Incremental computation aims to compute more efficiently on changed input by reusing
previously computed results. We give a high-level overview of works on incremental …
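The core idea in the snippet can be illustrated with a toy example: when one input element changes, an incremental program updates the previously computed result in O(1) instead of recomputing from scratch in O(n). The sum-maintenance sketch below is a generic illustration, not taken from the surveyed works.

```cpp
#include <cstdio>
#include <numeric>
#include <vector>

// Toy illustration of incremental computation: maintain a sum under a
// single-element update by reusing the previously computed result.
int main() {
    std::vector<int> x = {3, 1, 4, 1, 5};
    long long sum = std::accumulate(x.begin(), x.end(), 0LL);  // from scratch: 14

    // Input change: x[2] becomes 9.
    int old_value = x[2];
    x[2] = 9;
    sum += x[2] - old_value;  // O(1) incremental update, reusing the old sum

    std::printf("sum = %lld\n", sum);  // prints 19
    return 0;
}
```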
Parallelizing neural network models effectively on GPU by implementing reductions atomically
Due to the lack of a good orchestration of loop transformations, existing optimizing
compilers for deploying neural networks on GPU either parallelize reductions ineffectively or …
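As a rough analogue of the atomic-reduction strategy mentioned in the snippet, the CPU-side C++ sketch below lets each thread accumulate a private partial sum and then combine it into the shared result with an atomic add. The thread count and data are assumptions, and a GPU compiler would emit device atomics rather than std::atomic; this is only meant to show why atomics let a reduction parallelize without locks.

```cpp
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

// CPU-side analogue of an atomic reduction: each thread reduces its own
// strided chunk locally, then combines into the shared result atomically.
int main() {
    const int n = 1 << 20;
    std::vector<int> data(n, 1);
    std::atomic<long long> sum{0};

    const int num_threads = 4;
    std::vector<std::thread> threads;
    for (int t = 0; t < num_threads; ++t) {
        threads.emplace_back([&, t] {
            long long local = 0;                  // per-thread partial sum
            for (int i = t; i < n; i += num_threads)
                local += data[i];
            sum.fetch_add(local);                 // atomic combine into result
        });
    }
    for (auto& th : threads) th.join();

    std::printf("sum = %lld\n", sum.load());  // prints 1048576
    return 0;
}
```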
Simplification of Polyhedral Reductions in Practice
Reductions combine collections of inputs with an associative (and here, also commutative)
operator to produce collections of outputs. When the same value contributes to multiple …
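A classic example of the reuse the snippet alludes to: the prefix sums Y[i] = Σ_{j≤i} X[j] naively take O(n²) work, but because consecutive outputs share almost all of their inputs, the reduction simplifies to the O(n) recurrence Y[i] = Y[i-1] + X[i]. The sketch below illustrates this textbook case, not the paper's simplification algorithm.

```cpp
#include <cstdio>
#include <vector>

// Textbook reduction simplification: computing each prefix sum
// independently is O(n^2), but reusing the previous output gives O(n).
int main() {
    std::vector<int> x = {2, 7, 1, 8, 2};
    std::vector<int> y(x.size());

    y[0] = x[0];
    for (size_t i = 1; i < x.size(); ++i)
        y[i] = y[i - 1] + x[i];  // reuse the previously produced output

    for (size_t i = 0; i < y.size(); ++i)
        std::printf("Y[%zu] = %d\n", i, y[i]);  // prints 2 9 10 18 20
    return 0;
}
```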
Maximal Simplification of Polyhedral Reductions
Reductions combine collections of input values with an associative and often commutative
operator to produce collections of results. When the same input value contributes to multiple …
Incremental Computation: What Is the Essence? (Invited Contribution)
YA Liu - Proceedings of the 2024 ACM SIGPLAN International …, 2024 - dl.acm.org
Incremental computation aims to compute more efficiently on changed input by reusing
previously computed results. We give a high-level overview of works on incremental …