Large language models for compiler optimization
We explore the novel application of Large Language Models to code optimization. We
present a 7B-parameter transformer model trained from scratch to optimize LLVM assembly …
TpuGraphs: A performance prediction dataset on large tensor computational graphs
Precise hardware performance models play a crucial role in code optimizations. They can
assist compilers in making heuristic decisions or aid autotuners in identifying the optimal …
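The snippet motivates learned cost models as a cheap substitute for measuring every candidate configuration on hardware. A schematic of how an autotuner might consume such a model; the function names and loop below are hypothetical illustrations, not the TpuGraphs paper's own pipeline:

```python
# Hypothetical autotuning loop: rank candidate configurations with a learned cost
# model, then verify only the most promising ones with real hardware measurements.
def autotune(candidates, predict_cost, measure_on_hardware, top_k=3):
    ranked = sorted(candidates, key=predict_cost)    # cheap: model inference only
    shortlist = ranked[:top_k]                       # limit expensive measurements
    return min(shortlist, key=measure_on_hardware)   # pick the true best of the shortlist
```

A dataset like TpuGraphs exists to train the cost model well enough that such a shortlist reliably contains the genuinely optimal configuration.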
Meta large language model compiler: Foundation models of compiler optimization
Large Language Models (LLMs) have demonstrated remarkable capabilities across a
variety of software engineering and coding tasks. However, their application in the domain of …
TelaMalloc: Efficient on-chip memory allocation for production machine learning accelerators
Memory buffer allocation for on-chip memories is a major challenge in modern machine
learning systems that target ML accelerators. In interactive systems such as mobile phones …
Compiler generated feedback for Large Language Models
We introduce a novel paradigm in compiler optimization powered by Large Language
Models with compiler feedback to optimize the code size of LLVM assembly. The model …
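The abstract describes a model that emits optimized LLVM IR and is steered by compiler feedback. A rough sketch of such a loop, assuming a placeholder `query_model` callable (standing in for whatever LLM endpoint is used) and using LLVM's `opt` only to check that candidate IR parses; this is an illustration, not the paper's actual pipeline:

```python
import subprocess
import tempfile

def ir_is_valid(ir_text: str) -> tuple[bool, str]:
    """Round-trip the IR through LLVM's `opt` to check that it parses and verifies.
    Returns (ok, compiler_diagnostics)."""
    with tempfile.NamedTemporaryFile("w", suffix=".ll", delete=False) as f:
        f.write(ir_text)
        path = f.name
    proc = subprocess.run(["opt", "-S", "-o", "/dev/null", path],
                          capture_output=True, text=True)
    return proc.returncode == 0, proc.stderr

def instruction_count(ir_text: str) -> int:
    """Crude code-size proxy: count non-comment, non-label IR lines."""
    return sum(1 for line in ir_text.splitlines()
               if line.strip()
               and not line.strip().startswith((";", "declare", "define", "}"))
               and not line.strip().endswith(":"))

def optimize_with_feedback(ir_text: str, query_model, rounds: int = 3) -> str:
    """Hypothetical feedback loop: ask the model for smaller IR, hand compiler
    diagnostics back to it, and keep the best verified candidate."""
    best = ir_text
    prompt = f"Rewrite this LLVM IR to minimize code size:\n{ir_text}"
    for _ in range(rounds):
        candidate = query_model(prompt)  # hypothetical LLM call
        ok, diagnostics = ir_is_valid(candidate)
        if ok and instruction_count(candidate) < instruction_count(best):
            best = candidate
        # Feed the compiler's feedback into the next prompt.
        prompt = (f"Previous attempt:\n{candidate}\n"
                  f"Compiler feedback:\n{diagnostics or 'verified OK'}\n"
                  "Produce a smaller, valid version.")
    return best
```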
ALT: Breaking the wall between data layout and loop optimizations for deep learning compilation
Deep learning models rely on highly optimized tensor libraries for efficient inference on
heterogeneous hardware. Current deep compilers typically predetermine layouts of tensors …
Understanding LLM Embeddings for Regression
With the rise of large language models (LLMs) for flexibly processing information as strings,
a natural application is regression, specifically by preprocessing string representations into …
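The recipe the abstract points at is easy to prototype: serialize each input as a string, embed it, and fit a standard regressor on the embedding vectors. A minimal sketch under assumed placeholders; the encoder choice, inputs, and targets below are illustrative, and in the paper's setting the embeddings would come from the LLM under study rather than a small sentence encoder:

```python
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import Ridge

# Placeholder (x, y) pairs: string descriptions of configurations and a numeric score.
texts = ["batch=32 lr=0.1", "batch=64 lr=0.01", "batch=128 lr=0.001", "batch=256 lr=0.0001"]
targets = [0.82, 0.88, 0.91, 0.86]

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # stand-in for an LLM embedding model
X = encoder.encode(texts)                          # one fixed-size vector per string

regressor = Ridge(alpha=1.0).fit(X, targets)       # regress directly on the embeddings
print(regressor.predict(encoder.encode(["batch=96 lr=0.05"])))
```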
MLGOPerf: An ML guided inliner to optimize performance
For the past 25 years, we have witnessed an extensive application of Machine Learning to
the compiler space, notably the selection and phase-ordering problems. However, limited works …
Neural architecture search using property guided synthesis
Neural architecture search (NAS) has become an increasingly important tool within the deep
learning community in recent years, yielding many practical advancements in the design of …
learning community in recent years, yielding many practical advancements in the design of …
Saturn: An Optimized Data System for Large Model Deep Learning Workloads
K Nagrecha, A Kumar. arXiv preprint arXiv:2309.01226, 2023.
Large language models such as GPT-3 & ChatGPT have transformed deep learning (DL),
powering applications that have captured the public's imagination. These models are rapidly …