Geometry processing with neural fields

G Yang, S Belongie, B Hariharan… - Advances in Neural …, 2021 - proceedings.neurips.cc
Most existing geometry processing algorithms use meshes as the default shape
representation. Manipulating meshes, however, requires one to maintain high quality in the …

Polynomial neural fields for subband decomposition and manipulation

G Yang, S Benaim, V Jampani… - Advances in …, 2022 - proceedings.neurips.cc
Neural fields have emerged as a new paradigm for representing signals, thanks to their
ability to do so compactly while being easy to optimize. In most applications, however, neural …
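
At its core, a neural field is just a small network mapping coordinates to signal values, queryable at any continuous location. A minimal NumPy sketch (the architecture, sizes, and the sinusoidal encoding here are illustrative choices, not the paper's exact model):

```python
import numpy as np

rng = np.random.default_rng(0)

def positional_encoding(x, n_freqs=4):
    # Lift scalar coordinates to sin/cos features -- a common neural-field
    # input encoding that helps MLPs fit high-frequency signals.
    freqs = (2.0 ** np.arange(n_freqs)) * np.pi
    ang = x[:, None] * freqs[None, :]
    return np.concatenate([np.sin(ang), np.cos(ang)], axis=1)

def neural_field(coords, params):
    # The "field": a tiny MLP f_theta mapping each coordinate to a signal value.
    W1, b1, W2, b2 = params
    h = np.tanh(positional_encoding(coords) @ W1 + b1)
    return h @ W2 + b2

d_in, d_h = 8, 16  # 2 * n_freqs input features, hidden width (illustrative)
params = (rng.normal(0.0, 0.5, (d_in, d_h)), np.zeros(d_h),
          rng.normal(0.0, 0.5, (d_h, 1)), np.zeros(1))

coords = np.linspace(0.0, 1.0, 100)    # query the field at any continuous location
values = neural_field(coords, params)  # one signal value per query coordinate
```

The compactness the abstract refers to comes from the fact that the whole signal is stored in `params` rather than on a grid.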

Regularization of polynomial networks for image recognition

GG Chrysos, B Wang, J Deng… - Proceedings of the …, 2023 - openaccess.thecvf.com
Deep Neural Networks (DNNs) have obtained impressive performance across
tasks; however, they still remain black boxes, e.g., hard to theoretically analyze. At the …

Neural Redshift: Random Networks are not Random Functions

D Teney, AM Nicolicioiu, V Hartmann… - Proceedings of the …, 2024 - openaccess.thecvf.com
Our understanding of the generalization capabilities of neural networks (NNs) is still
incomplete. Prevailing explanations are based on implicit biases of gradient descent (GD) but …

Extrapolation and spectral bias of neural nets with Hadamard product: a polynomial net study

Y Wu, Z Zhu, F Liu, G Chrysos… - Advances in neural …, 2022 - proceedings.neurips.cc
Neural tangent kernel (NTK) is a powerful tool to analyze training dynamics of neural
networks and their generalization bounds. The study on NTK has been devoted to typical …
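
The Hadamard-product networks the title refers to build polynomials of the input by elementwise-multiplying linear maps. A small sketch of one common parameterization (a CCP-style recursion; all names and dimensions are illustrative, not the paper's exact setup):

```python
import numpy as np

rng = np.random.default_rng(1)

d, k = 4, 8  # input and hidden widths (illustrative)
Us = [rng.normal(0, 0.3, (k, d)) for _ in range(3)]  # one factor per degree
C = rng.normal(0, 0.3, (2, k))                       # output head

def hadamard_poly_net(z, Us, C):
    # Each step Hadamard-multiplies a linear map of the input with the running
    # features, raising the polynomial degree of the output in z by one.
    h = Us[0] @ z
    for U in Us[1:]:
        h = (U @ z) * h + h  # skip connection keeps the lower-degree terms too
    return C @ h

y = hadamard_poly_net(rng.normal(size=d), Us, C)  # a degree-3 polynomial in z
```

Because the output is a fixed-degree polynomial rather than a piecewise function, its behavior outside the training range (extrapolation) is analytically tractable, which is what the NTK analysis exploits.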

Quadratic residual multiplicative filter neural networks for efficient approximation of complex sensor signals

MU Demirezen - IEEE Access, 2023 - ieeexplore.ieee.org
In this research, we present an innovative Quadratic Residual Multiplicative Filter Neural
Network (QRMFNN) to effectively learn extremely complex sensor signals as a low …
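
A multiplicative filter network, the base architecture being extended here, multiplies a linear transform of the previous features with a fresh sinusoidal filter of the raw input at every layer, so the output is a sum of products of sinusoids. A hedged sketch of that base recursion (sizes and filter choice are illustrative; the paper's quadratic-residual additions are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(2)

d_in, d_h, n_layers = 1, 32, 3  # toy 1-D signal approximator (illustrative)

# One sinusoidal filter per layer, each applied directly to the input coords.
filt = [(rng.normal(0, 4.0, (d_in, d_h)), rng.uniform(-np.pi, np.pi, d_h))
        for _ in range(n_layers)]
lin = [(rng.normal(0, 0.3, (d_h, d_h)), np.zeros(d_h))
       for _ in range(n_layers - 1)]
W_out = rng.normal(0, 0.3, (d_h, 1))

def mfn(x):
    # Multiplicative filter network: elementwise products of sinusoids of the
    # input generate new frequency bands layer by layer, with no activations.
    Wf, bf = filt[0]
    z = np.sin(x @ Wf + bf)
    for (Wl, bl), (Wf, bf) in zip(lin, filt[1:]):
        z = (z @ Wl + bl) * np.sin(x @ Wf + bf)
    return z @ W_out

x = np.linspace(-1.0, 1.0, 64)[:, None]
y = mfn(x)
```

The multiplicative structure is what makes these networks effective on highly oscillatory sensor signals that plain ReLU MLPs fit poorly.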

Linear Complexity Self-Attention With 3rd Order Polynomials

F Babiloni, I Marras, J Deng, F Kokkinos… - … on Pattern Analysis …, 2023 - ieeexplore.ieee.org
Self-attention mechanisms and non-local blocks have become crucial building blocks for
state-of-the-art neural architectures thanks to their unparalleled ability in capturing long …

Deep ReLU networks have surprisingly simple polytopes

FL Fan, W Huang, X Zhong, L Ruan, T Zeng… - arXiv preprint arXiv …, 2023 - arxiv.org
A ReLU network is a piecewise linear function over polytopes. Figuring out the properties of
such polytopes is of fundamental importance for the research and development of neural …
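
The opening claim is easy to verify numerically: within any region where the activation pattern is constant (one polytope), a ReLU network is exactly affine, so sampling it along a line shows zero curvature everywhere except where the line crosses a polytope face. A minimal check (the network and line are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(3)

# A random 2-in, 1-out ReLU network with one hidden layer of 16 units.
W1 = rng.normal(size=(2, 16)); b1 = rng.normal(size=16)
W2 = rng.normal(size=(16, 1)); b2 = rng.normal(size=1)

def relu_net(x):
    return np.maximum(x @ W1 + b1, 0.0) @ W2 + b2

# Sample densely along a line segment through input space.
t = np.linspace(-1.0, 1.0, 2001)[:, None]
line = t * np.array([1.0, -0.5])
y = relu_net(line).ravel()

# The activation pattern identifies which polytope each sample lies in.
act = (line @ W1 + b1) > 0
same_region = ((act[:-2] == act[1:-1]).all(axis=1)
               & (act[1:-1] == act[2:]).all(axis=1))

# Wherever three consecutive samples share a polytope, the second difference
# (discrete curvature) of the output is zero: the net is affine there.
max_curv_inside = np.abs(np.diff(y, 2))[same_region].max()
```

`max_curv_inside` comes out at floating-point noise, confirming piecewise linearity; the paper's question is how many such polytopes there are and what shapes they take.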

Sound and complete verification of polynomial networks

E Abad Rocamora, MF Sahin, F Liu… - Advances in …, 2022 - proceedings.neurips.cc
Polynomial Networks (PNs) have demonstrated promising performance on face and
image recognition recently. However, the robustness of PNs is unclear and thus obtaining …

Random Polynomial Neural Networks: Analysis and Design

W Huang, Y Xiao, SK Oh, W Pedrycz… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
In this article, we propose the concept of random polynomial neural networks (RPNNs)
realized based on the architecture of polynomial neural networks (PNNs) with random …