Physics-informed machine learning for modeling and control of dynamical systems

TX Nghiem, J Drgoňa, C Jones, Z Nagy… - 2023 American …, 2023 - ieeexplore.ieee.org
Physics-informed machine learning (PIML) is a set of methods and tools that systematically
integrate machine learning (ML) algorithms with physical constraints and abstract …
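
As a rough illustration of the core idea rather than the tutorial's own formulation, a physics-informed loss typically combines a data-fitting term with a penalty on the residual of an assumed governing equation. The toy ODE dx/dt = -k*x and all names below are hypothetical.

    # Minimal PINN-style sketch: fit x_theta(t) to data while penalizing
    # violation of the assumed physics prior dx/dt = -k * x.
    import torch

    torch.manual_seed(0)
    k = 0.5                                    # assumed known decay rate
    net = torch.nn.Sequential(
        torch.nn.Linear(1, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1)
    )

    t_data = torch.linspace(0.0, 2.0, 20).reshape(-1, 1)
    x_data = torch.exp(-k * t_data)            # synthetic measurements
    t_phys = torch.linspace(0.0, 2.0, 100).reshape(-1, 1).requires_grad_(True)

    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(2000):
        opt.zero_grad()
        data_loss = ((net(t_data) - x_data) ** 2).mean()
        x = net(t_phys)
        dxdt = torch.autograd.grad(x.sum(), t_phys, create_graph=True)[0]
        phys_loss = ((dxdt + k * x) ** 2).mean()   # ODE residual penalty
        (data_loss + phys_loss).backward()
        opt.step()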

Exploiting connections between Lipschitz structures for certifiably robust deep equilibrium models

A Havens, A Araujo, S Garg… - Advances in Neural …, 2023 - proceedings.neurips.cc
Recently, deep equilibrium models (DEQs) have drawn increasing attention from the
machine learning community. However, DEQs are much less understood in terms of certified …
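
For context, a minimal sketch of a generic DEQ layer (not the certified-robust construction studied in this paper): the output is defined implicitly as a fixed point z* = f(z*, x) and computed iteratively. All class and variable names below are illustrative.

    # Toy DEQ layer: the output is the equilibrium of z = tanh(W z + U x),
    # found here by plain forward (Picard-style) iteration.
    import torch

    class SimpleDEQ(torch.nn.Module):
        def __init__(self, dim):
            super().__init__()
            self.W = torch.nn.Linear(dim, dim, bias=False)
            self.U = torch.nn.Linear(dim, dim)

        def forward(self, x, iters=50):
            z = torch.zeros_like(x)
            for _ in range(iters):             # iterate toward the equilibrium
                z = torch.tanh(self.W(z) + self.U(x))
            return z

    y = SimpleDEQ(dim=8)(torch.randn(4, 8))

Bounding the Lipschitz constant of the update in z (for instance by constraining the spectral norm of W) is the kind of structural condition that certified-robustness analyses of DEQs build on.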

Physics-informed implicit representations of equilibrium network flows

KD Smith, F Seccamonte, A Swami… - Advances in Neural …, 2022 - proceedings.neurips.cc
Flow networks are ubiquitous in natural and engineered systems, and in order to understand
and manage these networks, one must quantify the flow of commodities across their edges …

Robust classification using contractive Hamiltonian neural ODEs

M Zakwan, L Xu… - IEEE Control Systems …, 2022 - ieeexplore.ieee.org
Deep neural networks can be fragile and sensitive to small input perturbations that might
cause a significant change in the output. In this letter, we employ contraction theory to …

Implicit graph neural networks: A monotone operator viewpoint

J Baker, Q Wang, CD Hauck… - … Conference on Machine …, 2023 - proceedings.mlr.press
Implicit graph neural networks (IGNNs), which solve a fixed-point equilibrium equation using
Picard iteration for representation learning, have shown remarkable performance in learning …
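
For concreteness, an illustrative NumPy sketch of the fixed-point equation behind IGNNs (not the monotone-operator solver proposed in the paper; all variable names are hypothetical): node representations X solve X = tanh(W X A + B), and Picard iteration simply re-applies this map until convergence.

    # Picard iteration for a toy implicit GNN layer X = tanh(W X A + B).
    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 5, 3                                # nodes, feature dimension
    A = rng.random((n, n))
    A = A / np.linalg.norm(A, 2)               # normalize so ||A||_2 = 1
    B = rng.standard_normal((d, n))            # input/bias features
    W = rng.standard_normal((d, d))
    W = 0.5 * W / np.linalg.norm(W, 2)         # ||W||_2 = 0.5 -> contraction

    X = np.zeros((d, n))
    for _ in range(200):                       # Picard iteration
        X_next = np.tanh(W @ X @ A + B)
        if np.linalg.norm(X_next - X) < 1e-9:
            break
        X = X_next

Without a contraction-type condition on W and the activation, the iteration need not converge, which is why well-posedness and operator-theoretic solvers are the focus of this line of work.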

Perspectives on contractivity in control, optimization, and learning

A Davydov, F Bullo - IEEE Control Systems Letters, 2024 - ieeexplore.ieee.org
Contraction theory is a mathematical framework for studying the convergence, robustness,
and modularity properties of dynamical systems and algorithms. In this opinion paper, we …
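
For reference, the property these contraction-based entries revolve around can be stated as follows (standard definition; the notation is generic rather than taken from any one of the papers):

    % Contraction with rate c > 0 in a norm \|\cdot\| with induced
    % matrix measure (log-norm) \mu: any two trajectories of
    % \dot{x} = f(x, t) converge toward each other exponentially,
    \[
      \|x(t) - y(t)\| \le e^{-ct}\,\|x(0) - y(0)\|,
    \]
    % and a standard sufficient condition is a uniformly negative
    % log-norm of the Jacobian:
    \[
      \mu\big(D_x f(x, t)\big) \le -c \quad \text{for all } x, t.
    \]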

Euclidean contractivity of neural networks with symmetric weights

V Centorrino, A Gokhale, A Davydov… - IEEE Control …, 2023 - ieeexplore.ieee.org
This letter investigates stability conditions of continuous-time Hopfield and firing-rate neural
networks by leveraging contraction theory. First, we present a number of useful general …
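
For orientation, the two continuous-time models named in the title are usually written in the following standard forms (not necessarily the paper's exact notation), with state x, weight matrix W, activation \Phi, and input u:

    % Hopfield network and firing-rate network, respectively:
    \[
      \dot{x} = -x + W\,\Phi(x) + u,
      \qquad
      \dot{x} = -x + \Phi(W x + u).
    \]

Contraction analyses of these models constrain the log-norm of the Jacobian of the right-hand side, which is where symmetry of W becomes useful.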

Non-Euclidean contractivity of recurrent neural networks

A Davydov, AV Proskurnikov… - 2022 American Control …, 2022 - ieeexplore.ieee.org
Critical questions in dynamical neuroscience and machine learning concern recurrent neural
networks and their stability, robustness, and computational efficiency …

RNNs of RNNs: Recursive construction of stable assemblies of recurrent neural networks

L Kozachkov, M Ennis… - Advances in Neural …, 2022 - proceedings.neurips.cc
Recurrent neural networks (RNNs) are widely used throughout neuroscience as models of
local neural activity. Many properties of single RNNs are well characterized theoretically, but …

The Yakubovich S-Lemma revisited: Stability and contractivity in non-Euclidean norms

AV Proskurnikov, A Davydov, F Bullo - SIAM Journal on Control and …, 2023 - SIAM
The celebrated S-Lemma was originally proposed to ensure the existence of a quadratic
Lyapunov function in the Lur'e problem of absolute stability. A quadratic Lyapunov function …
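
For reference, the classical (Euclidean) statement the title alludes to is the following standard form of the Yakubovich S-Lemma; the paper's non-Euclidean extension is not reproduced here.

    % S-Lemma: for quadratic forms \sigma_0, \sigma_1 on R^n, assuming
    % some \bar{x} satisfies \sigma_0(\bar{x}) > 0 (strict feasibility),
    \[
      \sigma_1(x) \ge 0 \;\text{ whenever } \sigma_0(x) \ge 0
      \quad\iff\quad
      \exists\, \tau \ge 0 :\;
      \sigma_1(x) - \tau\,\sigma_0(x) \ge 0 \;\text{ for all } x \in \mathbb{R}^n.
    \]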