Your local GAN: Designing two dimensional local attention mechanisms for generative models

G Daras, A Odena, H Zhang… - Proceedings of the …, 2020 - openaccess.thecvf.com
We introduce a new local sparse attention layer that preserves two-dimensional geometry
and locality. We show that by just replacing the dense attention layer of SAGAN with our …
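The snippet only names the idea of attention restricted to a two-dimensional neighbourhood, so here is a minimal, hypothetical sketch (not the layer from the paper) of what "local" attention over an image grid can look like in PyTorch: each query position attends only to a small window around it rather than to the whole feature map. The function name, window size, and shapes are assumptions made purely for illustration.

```python
# Hypothetical sketch of 2D local attention over a k x k neighbourhood.
# Not the paper's layer; shapes and names are illustrative assumptions.
import torch
import torch.nn.functional as F

def local_attention_2d(q, k, v, window=7):
    """q, k, v: (B, C, H, W) feature maps. Each position attends to a
    window x window neighbourhood around itself (zero-padded at borders)."""
    B, C, H, W = q.shape
    pad = window // 2
    # Gather the local neighbourhood of keys/values for every position:
    # unfold -> (B, C * window * window, H * W)
    k_loc = F.unfold(k, kernel_size=window, padding=pad).view(B, C, window * window, H * W)
    v_loc = F.unfold(v, kernel_size=window, padding=pad).view(B, C, window * window, H * W)
    q_flat = q.view(B, C, 1, H * W)                    # one query per position
    attn = (q_flat * k_loc).sum(dim=1) / C ** 0.5      # (B, window*window, H*W)
    attn = attn.softmax(dim=1)                         # normalize over the window
    out = (v_loc * attn.unsqueeze(1)).sum(dim=2)       # (B, C, H*W)
    return out.view(B, C, H, W)

# Usage: self-attention restricted to 7x7 windows on a small feature map.
x = torch.randn(2, 64, 32, 32)
y = local_attention_2d(x, x, x)
```

The point of the sketch is only the cost/geometry trade-off the title alludes to: restricting each query to a local window keeps the two-dimensional structure of the feature map and avoids the quadratic cost of dense attention.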

Patch-based stochastic attention for image editing

N Cherel, A Almansa, Y Gousseau… - Computer Vision and …, 2024 - Elsevier
Attention mechanisms have become of crucial importance in deep learning in recent years.
These non-local operations, which are similar to traditional patch-based methods in image …
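To make the stated analogy between attention and traditional patch-based image methods concrete, here is a hypothetical sketch (not the method of the paper) in which an image is split into patches and each patch is replaced by a similarity-weighted average of all patches, in the spirit of non-local means. The function, patch size, and temperature are illustrative assumptions.

```python
# Hypothetical illustration of attention as a patch-based, non-local operation.
# Not the paper's method; all names and parameters are assumptions.
import torch
import torch.nn.functional as F

def patch_attention(image, patch=8, temperature=0.1):
    """image: (B, C, H, W) with H and W divisible by `patch`.
    Rebuilds the image from similarity-weighted combinations of its patches."""
    B, C, H, W = image.shape
    # Split into non-overlapping patches: (B, N, C * patch * patch)
    patches = F.unfold(image, kernel_size=patch, stride=patch).transpose(1, 2)
    # Pairwise patch similarity plays the role of attention scores.
    sim = patches @ patches.transpose(1, 2) / (patches.shape[-1] ** 0.5)
    weights = (sim / temperature).softmax(dim=-1)      # (B, N, N)
    mixed = weights @ patches                          # weighted patch averages
    # Fold the mixed patches back into an image.
    return F.fold(mixed.transpose(1, 2), output_size=(H, W),
                  kernel_size=patch, stride=patch)

# Usage: mix the patches of a random image by their mutual similarity.
img = torch.randn(1, 3, 64, 64)
out = patch_attention(img)
```

The structural similarity to the previous sketch is the point: both compute softmax-normalized similarities and aggregate values with them; the difference is only whether the comparison units are single positions in a local window or whole patches compared globally.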