Flexible diffusion modeling of long videos

W Harvey, S Naderiparizi, V Masrani… - Advances in …, 2022 - proceedings.neurips.cc
We present a framework for video modeling based on denoising diffusion probabilistic
models that produces long-duration video completions in a variety of realistic environments …

Dataset distillation using neural feature regression

Y Zhou, E Nezhadarya, J Ba - Advances in Neural …, 2022 - proceedings.neurips.cc
Dataset distillation aims to learn a small synthetic dataset that preserves most of the
information from the original dataset. Dataset distillation can be formulated as a bi-level …

Brax - a differentiable physics engine for large scale rigid body simulation

CD Freeman, E Frey, A Raichuk, S Girgin… - arXiv preprint arXiv …, 2021 - arxiv.org
We present Brax, an open source library for rigid body simulation with a focus on
performance and parallelism on accelerators, written in JAX. We present results on a suite of …

[BOOK][B] Dive into deep learning

A Zhang, ZC Lipton, M Li, AJ Smola - 2023 - books.google.com
Deep learning has revolutionized pattern recognition, introducing tools that power a wide
range of technologies in such diverse fields as computer vision, natural language …

Theseus: A library for differentiable nonlinear optimization

L Pineda, T Fan, M Monge… - Advances in …, 2022 - proceedings.neurips.cc
We present Theseus, an efficient application-agnostic open source library for differentiable
nonlinear least squares (DNLS) optimization built on PyTorch, providing a common …

Learning discrete structures for graph neural networks

L Franceschi, M Niepert, M Pontil… - … conference on machine …, 2019 - proceedings.mlr.press
Graph neural networks (GNNs) are a popular class of machine learning models that have
been successfully applied to a range of problems. Their major advantage lies in their ability …

Residual flows for invertible generative modeling

RTQ Chen, J Behrmann… - Advances in Neural …, 2019 - proceedings.neurips.cc
Flow-based generative models parameterize probability distributions through an invertible
transformation and can be trained by maximum likelihood. Invertible residual networks …

Regularizing and optimizing LSTM language models

S Merity, NS Keskar, R Socher - arXiv preprint arXiv:1708.02182, 2017 - arxiv.org
Recurrent neural networks (RNNs), such as long short-term memory networks (LSTMs),
serve as a fundamental building block for many sequence learning tasks, including machine …

Character-level language modeling with deeper self-attention

R Al-Rfou, D Choe, N Constant, M Guo… - Proceedings of the AAAI …, 2019 - ojs.aaai.org
LSTMs and other RNN variants have shown strong performance on character-level
language modeling. These models are typically trained using truncated backpropagation …

Aligning text-to-image diffusion models with reward backpropagation

M Prabhudesai, A Goyal, D Pathak, K Fragkiadaki - 2023 - openreview.net
Text-to-image diffusion models have recently emerged at the forefront of image generation,
powered by very large-scale unsupervised or weakly supervised text-to-image training …