Geometry processing with neural fields
Most existing geometry processing algorithms use meshes as the default shape
representation. Manipulating meshes, however, requires one to maintain high quality in the …
Polynomial neural fields for subband decomposition and manipulation
Neural fields have emerged as a new paradigm for representing signals, thanks to their
ability to do so compactly while being easy to optimize. In most applications, however, neural …
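To make concrete what "representing a signal with a neural field" means, here is a minimal sketch: a coordinate MLP trained by gradient descent to map positions to signal values. The architecture, the toy 1-D signal, and the training schedule are arbitrary illustrative choices, not the paper's polynomial or subband construction.

```python
# Minimal sketch: a coordinate MLP ("neural field") fit to a toy 1-D signal.
import torch
import torch.nn as nn

x = torch.linspace(0.0, 1.0, 256).unsqueeze(1)                          # coordinates
y = torch.sin(8 * torch.pi * x) + 0.5 * torch.sin(20 * torch.pi * x)    # target signal

field = nn.Sequential(
    nn.Linear(1, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)
opt = torch.optim.Adam(field.parameters(), lr=1e-3)
for _ in range(2000):                        # fit coordinates -> signal values
    opt.zero_grad()
    loss = ((field(x) - y) ** 2).mean()
    loss.backward()
    opt.step()
print(f"final MSE: {loss.item():.4f}")
```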
Regularization of polynomial networks for image recognition
Deep Neural Networks (DNNs) have obtained impressive performance across
tasks; however, they still remain black boxes, e.g., hard to analyze theoretically. At the …
Neural Redshift: Random Networks are not Random Functions
Our understanding of the generalization capabilities of neural networks (NNs) is still
incomplete. Prevailing explanations are based on implicit biases of gradient descent (GD), but …
Extrapolation and spectral bias of neural nets with Hadamard product: a polynomial net study
Neural tangent kernel (NTK) is a powerful tool to analyze training dynamics of neural
networks and their generalization bounds. The study of NTK has been devoted to typical …
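For readers unfamiliar with the multiplicative architectures this analysis targets, below is a minimal sketch of a polynomial (Pi-net style) block whose output is a degree-2 polynomial of its input, built from a Hadamard product of two linear projections. The widths and the specific degree-2 form are assumptions for illustration, not the exact parametrization studied in the paper.

```python
# Minimal sketch of a polynomial block built from a Hadamard (elementwise) product.
import torch
import torch.nn as nn

class HadamardPolyBlock(nn.Module):
    """y = (A x) * (B x) + C x  -- a degree-2 polynomial of the input x."""
    def __init__(self, dim_in, dim_out):
        super().__init__()
        self.A = nn.Linear(dim_in, dim_out, bias=False)
        self.B = nn.Linear(dim_in, dim_out, bias=False)
        self.C = nn.Linear(dim_in, dim_out, bias=False)

    def forward(self, x):
        return self.A(x) * self.B(x) + self.C(x)   # Hadamard product of two projections

# Stacking blocks multiplies degrees: two degree-2 blocks give a degree-4 polynomial.
net = nn.Sequential(HadamardPolyBlock(4, 32), HadamardPolyBlock(32, 1))
print(net(torch.randn(8, 4)).shape)                # torch.Size([8, 1])
```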
Quadratic residual multiplicative filter neural networks for efficient approximation of complex sensor signals
MU Demirezen - IEEE Access, 2023 - ieeexplore.ieee.org
In this research, we present an innovative Quadratic Residual Multiplicative Filter Neural
Network (QRMFNN) to effectively learn extremely complex sensor signals as a low …
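As background, here is a minimal sketch of a plain multiplicative filter network of the kind QRMFNN builds on: each stage multiplies a linear transform of the hidden state by a sinusoidal filter of the raw input. The layer sizes, the frequency-initialization scale, and the omission of the quadratic residual terms are simplifying assumptions; this is not the QRMFNN formulation itself.

```python
# Minimal sketch of a multiplicative filter network (sinusoidal filters, multiplicative mixing).
import torch
import torch.nn as nn

class SineFilter(nn.Module):
    def __init__(self, dim_in, dim_hidden, scale=10.0):
        super().__init__()
        self.lin = nn.Linear(dim_in, dim_hidden)
        nn.init.uniform_(self.lin.weight, -scale, scale)    # broadband frequencies

    def forward(self, x):
        return torch.sin(self.lin(x))

class MFN(nn.Module):
    def __init__(self, dim_in=1, dim_hidden=64, dim_out=1, n_layers=3):
        super().__init__()
        self.filters = nn.ModuleList([SineFilter(dim_in, dim_hidden) for _ in range(n_layers)])
        self.linears = nn.ModuleList([nn.Linear(dim_hidden, dim_hidden) for _ in range(n_layers - 1)])
        self.out = nn.Linear(dim_hidden, dim_out)

    def forward(self, x):
        z = self.filters[0](x)
        for lin, filt in zip(self.linears, self.filters[1:]):
            z = lin(z) * filt(x)                 # multiplicative combination of filters
        return self.out(z)

model = MFN()
print(model(torch.linspace(0, 1, 100).unsqueeze(1)).shape)   # torch.Size([100, 1])
```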
Linear Complexity Self-Attention With 3rd Order Polynomials
Self-attention mechanisms and non-local blocks have become crucial building blocks for
state-of-the-art neural architectures thanks to their unparalleled ability to capture long …
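To illustrate how polynomial feature maps yield attention with cost linear in sequence length, here is a generic kernelized-attention sketch using an explicit degree-2 feature map, so the result factorizes as phi(Q) @ (phi(K)^T V). The feature map and normalization are illustrative assumptions, not the paper's third-order scheme.

```python
# Minimal sketch: attention with a degree-2 polynomial kernel computed in O(N) time.
import torch

def poly2_features(x):
    # phi(x) = [1, x, vec(x x^T)], so <phi(q), phi(k)> = 1 + q.k + (q.k)^2, which is always > 0
    n, d = x.shape
    quad = (x.unsqueeze(2) * x.unsqueeze(1)).reshape(n, d * d)
    return torch.cat([torch.ones(n, 1), x, quad], dim=1)      # (n, 1 + d + d^2)

def linear_poly_attention(Q, K, V):
    phi_q, phi_k = poly2_features(Q), poly2_features(K)
    kv = phi_k.T @ V                                           # (F, d_v): linear in sequence length
    z = phi_q @ phi_k.sum(dim=0, keepdim=True).T               # per-query normalizer, also linear
    return (phi_q @ kv) / z

N, d = 512, 16
Q, K, V = torch.randn(N, d), torch.randn(N, d), torch.randn(N, d)
print(linear_poly_attention(Q, K, V).shape)                    # torch.Size([512, 16])
```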
Deep ReLU networks have surprisingly simple polytopes
A ReLU network is a piecewise linear function over polytopes. Figuring out the properties of
such polytopes is of fundamental importance for the research and development of neural …
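A quick way to see this piecewise-linear picture is to sample inputs, record each point's ReLU activation pattern, and count distinct patterns; each pattern corresponds to a convex polytope on which the network is exactly linear. The tiny random network and grid resolution below are arbitrary illustrative choices.

```python
# Minimal sketch: count activation patterns (linear regions) of a small ReLU net on a 2-D grid.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((8, 2)), rng.standard_normal(8)
W2, b2 = rng.standard_normal((8, 8)), rng.standard_normal(8)

xs = np.linspace(-2, 2, 200)
grid = np.stack(np.meshgrid(xs, xs), axis=-1).reshape(-1, 2)        # (40000, 2) inputs

h1 = grid @ W1.T + b1
a1 = np.maximum(h1, 0)
h2 = a1 @ W2.T + b2
pattern = np.concatenate([h1 > 0, h2 > 0], axis=1)                  # activation pattern per input

n_regions = len({p.tobytes() for p in pattern})
print(f"distinct activation patterns (linear regions) sampled: {n_regions}")
```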
Sound and complete verification of polynomial networks
Polynomial Networks (PNs) have recently demonstrated promising performance on face and
image recognition. However, the robustness of PNs is unclear, and thus obtaining …
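To see why verifying polynomial networks needs more than linear reasoning, the sketch below propagates interval bounds through a degree-2 layer y = (W1 x) * (W2 x), where the elementwise product of two intervals must itself be bounded. This is generic interval arithmetic for illustration, not the sound-and-complete branch-and-bound verifier the paper proposes.

```python
# Minimal sketch: interval bound propagation through a degree-2 (multiplicative) layer.
import numpy as np

def linear_bounds(W, lo, hi):
    """Exact per-coordinate interval image of x -> W x for x in the box [lo, hi]."""
    c, r = (lo + hi) / 2, (hi - lo) / 2
    center, radius = W @ c, np.abs(W) @ r
    return center - radius, center + radius

def product_bounds(a_lo, a_hi, b_lo, b_hi):
    """Sound interval bounds of an elementwise product a * b (min/max over corner products)."""
    cands = np.stack([a_lo * b_lo, a_lo * b_hi, a_hi * b_lo, a_hi * b_hi])
    return cands.min(axis=0), cands.max(axis=0)

rng = np.random.default_rng(0)
W1, W2 = rng.standard_normal((4, 3)), rng.standard_normal((4, 3))
x_lo, x_hi = np.zeros(3) - 0.1, np.zeros(3) + 0.1     # L_inf ball of radius 0.1 around the origin

u_lo, u_hi = linear_bounds(W1, x_lo, x_hi)
v_lo, v_hi = linear_bounds(W2, x_lo, x_hi)
y_lo, y_hi = product_bounds(u_lo, u_hi, v_lo, v_hi)   # bounds on (W1 x) * (W2 x)
print(y_lo, y_hi)
```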
Random Polynomial Neural Networks: Analysis and Design
In this article, we propose the concept of random polynomial neural networks (RPNNs)
built on the architecture of polynomial neural networks (PNNs) with random …
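The core idea of random hidden polynomial units combined with a trained linear readout can be sketched as below: random degree-2 units are generated once, and only the output weights are estimated, here by ridge-regularized least squares. The degree, width, ridge strength, and toy target are assumptions; the actual RPNN design and unit-selection procedure is not reproduced.

```python
# Minimal sketch: random polynomial hidden units, closed-form readout.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(500, 3))
y = np.sin(X[:, 0]) + X[:, 1] * X[:, 2]               # toy regression target

# Random degree-2 polynomial units: h_j(x) = (w_j . x + b_j)^2, never trained
W, b = rng.standard_normal((64, 3)), rng.standard_normal(64)
H = (X @ W.T + b) ** 2

# Only the linear readout is estimated (ridge regression, closed form)
lam = 1e-3
beta = np.linalg.solve(H.T @ H + lam * np.eye(H.shape[1]), H.T @ y)
print("train MSE:", np.mean((H @ beta - y) ** 2))
```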