Creation and analysis of biochemical constraint-based models using the COBRA Toolbox v. 3.0

L Heirendt, S Arreckx, T Pfau, SN Mendoza… - Nature Protocols, 2019 - nature.com
Constraint-based reconstruction and analysis (COBRA) provides a molecular mechanistic
framework for integrative analysis of experimental molecular systems biology data and …

A modified inexact Levenberg–Marquardt method with the descent property for solving nonlinear equations

J Yin, J Jian, G Ma - Computational Optimization and Applications, 2024 - Springer
In this work, we propose a modified inexact Levenberg–Marquardt method with the descent
property for solving nonlinear equations. A novel feature of the proposed method is that one …
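For context, the classical Levenberg–Marquardt iteration that such modified and inexact variants build on can be sketched as follows. This is a generic textbook sketch, not the method proposed in the paper above: each step solves the damped normal equations and adapts the damping parameter based on whether the trial step reduced the residual.

```python
import numpy as np

def levenberg_marquardt(F, J, x0, mu=1e-3, tol=1e-10, max_iter=100):
    """Classical Levenberg-Marquardt iteration for solving F(x) = 0.

    F: residual function R^n -> R^m
    J: Jacobian of F
    mu: initial damping parameter
    """
    x = x0.astype(float)
    for _ in range(max_iter):
        f = F(x)
        if np.linalg.norm(f) < tol:
            break
        Jx = J(x)
        # Damped normal equations: (J^T J + mu I) d = -J^T f
        A = Jx.T @ Jx + mu * np.eye(x.size)
        d = np.linalg.solve(A, -Jx.T @ f)
        if np.linalg.norm(F(x + d)) < np.linalg.norm(f):
            x = x + d
            mu *= 0.5   # good step: less damping, closer to Gauss-Newton
        else:
            mu *= 2.0   # bad step: more damping, closer to gradient descent
    return x

# Example: solve x0^2 + x1^2 = 1, x0 - x1 = 0
F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])
J = lambda x: np.array([[2*x[0], 2*x[1]], [1.0, -1.0]])
x = levenberg_marquardt(F, J, np.array([2.0, 0.5]))
```

The damping term interpolates between a Gauss–Newton step (mu → 0) and a short gradient-descent step (mu large), which is what gives the method its robustness far from a solution.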

A Bregman forward-backward linesearch algorithm for nonconvex composite optimization: superlinear convergence to nonisolated local minima

M Ahookhosh, A Themelis, P Patrinos - SIAM Journal on Optimization, 2021 - SIAM
We introduce Bella, a locally superlinearly convergent Bregman forward-backward splitting
method for minimizing the sum of two nonconvex functions, one of which satisfies a relative …
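The Euclidean special case of such forward-backward splitting methods is the familiar proximal-gradient iteration; the Bregman variant studied above replaces the squared Euclidean distance in the proximal step with a Bregman distance generated by a kernel function. A minimal sketch of the Euclidean case (generic, not the Bella algorithm itself), applied to an l1-regularized least-squares problem:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (the "backward" step)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(grad_f, prox_g, x0, step, n_iter=500):
    """Euclidean forward-backward splitting for min f(x) + g(x):
        x+ = prox_{step*g}(x - step * grad_f(x))
    i.e. a gradient ("forward") step on the smooth part f followed by
    a proximal ("backward") step on the nonsmooth part g."""
    x = x0.astype(float)
    for _ in range(n_iter):
        x = prox_g(x - step * grad_f(x), step)
    return x

# min 0.5*||x - b||^2 + lam*||x||_1, whose solution is soft_threshold(b, lam)
b = np.array([3.0, -0.2, 1.5])
lam = 0.5
x = forward_backward(lambda x: x - b,
                     lambda v, s: soft_threshold(v, s * lam),
                     np.zeros(3), step=1.0)
```

Here f is 0.5*||x - b||^2 and g is the l1 norm; the closed-form solution soft_threshold(b, lam) makes it easy to check the iteration.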

Explosive sound source localization in indoor and outdoor environments using modified Levenberg Marquardt algorithm

C Mahapatra, AR Mohanty - Measurement, 2022 - Elsevier
In this paper, a modified Levenberg-Marquardt algorithm (MLMA) is proposed to localize the
'point of burst' of an explosive sound source over the range of (0.5–2500) m. The objective …

A subspace inertial method for derivative-free nonlinear monotone equations

M Kimiaei, A Hassan Ibrahim, S Ghaderi - Optimization, 2023 - Taylor & Francis
We introduce a subspace inertial line search algorithm (SILSA), for finding solutions of
nonlinear monotone equations (NME). At each iteration, a new point is generated in a …

Determining the Number of Neurons in Artificial Neural Networks for Approximation, Trained with Algorithms Using the Jacobi Matrix

K Yotov, E Hadzhikolev, S Hadzhikoleva - TEM Journal, 2020 - ceeol.com
How can we determine the optimal number of neurons when constructing an artificial neural
network? This is one of the most frequently asked questions when working with this type of …

A fast and simple modification of Newton's method avoiding saddle points

TT Truong, TD To, HT Nguyen, TH Nguyen… - Journal of Optimization …, 2023 - Springer
We propose in this paper New Q-Newton's method. The update rule is conceptually very
simple, using the projections to the vector subspaces generated by eigenvectors of positive …
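The general idea behind such eigenvector-projection updates can be illustrated with a simplified sketch (an assumption-laden illustration, not the paper's exact New Q-Newton's update): perturb the Hessian if it is near-singular, then take a Newton-type step in which eigenvalues are replaced by their absolute values, so the step reverses along negative-curvature directions and saddle points become repelling.

```python
import numpy as np

def saddle_avoiding_newton_step(grad, hess, x, delta=1e-4):
    """One Newton-type step that avoids attraction to saddle points
    (simplified sketch of the eigenvalue-flipping idea only)."""
    g = grad(x)
    H = hess(x)
    # Perturb so the matrix is invertible (scaled by the gradient norm)
    if abs(np.linalg.det(H)) < 1e-12:
        H = H + delta * np.linalg.norm(g) * np.eye(x.size)
    evals, evecs = np.linalg.eigh(H)
    # Invert with |eigenvalue|: reverses the step on negative-curvature
    # eigenvector directions, so the iterate moves away from saddles
    step = evecs @ ((evecs.T @ g) / np.abs(evals))
    return x - step

# f(x, y) = x^2 - y^2 has a saddle at the origin; plain Newton jumps
# straight to it, while this step escapes along the y-axis.
grad = lambda x: np.array([2*x[0], -2*x[1]])
hess = lambda x: np.array([[2.0, 0.0], [0.0, -2.0]])
x = np.array([0.1, 0.1])
for _ in range(5):
    x = saddle_avoiding_newton_step(grad, hess, x)
```

On this example the x-coordinate is driven to zero as in an ordinary Newton step, while the y-coordinate doubles every iteration, carrying the iterate away from the saddle at the origin.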

Multi-step training of a generalized linear classifier

K Tyagi, M Manry - Neural Processing Letters, 2019 - Springer
We propose a multi-step training method for designing generalized linear classifiers. First,
an initial multi-class linear classifier is found through regression. Then validation error is …


Backtracking New Q-Newton's method: a good algorithm for optimization and solving systems of equations

TT Truong - arXiv preprint arXiv:2209.05378, 2022 - arxiv.org
In this paper, by combining the algorithm New Q-Newton's method, developed in previous
joint work of the author, with Armijo's Backtracking line search, we resolve convergence …
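Armijo's backtracking line search, which the paper combines with New Q-Newton's method, is itself standard and can be sketched generically (this is the textbook procedure, not the paper's combined algorithm): shrink the step size until a sufficient-decrease condition holds along the search direction.

```python
import numpy as np

def armijo_backtracking(f, grad_f, x, d, alpha=1.0, beta=0.5, c=1e-4):
    """Armijo backtracking line search: halve alpha until the
    sufficient-decrease condition holds,
        f(x + alpha*d) <= f(x) + c * alpha * <grad_f(x), d>,
    where d is assumed to be a descent direction."""
    fx = f(x)
    slope = c * float(grad_f(x) @ d)
    while f(x + alpha * d) > fx + alpha * slope:
        alpha *= beta
    return alpha

# Example: f(x) = ||x||^2 from x = [1.0] along the steepest-descent
# direction d = -grad_f(x) = [-2.0]
f = lambda x: float(x @ x)
grad_f = lambda x: 2 * x
alpha = armijo_backtracking(f, grad_f, np.array([1.0]), np.array([-2.0]))
```

The loop terminates after finitely many halvings whenever d is a descent direction, which is the property that line-search convergence proofs rely on.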

Generalized metric subregularity with applications to high-order regularized Newton methods

G Li, B Mordukhovich, J Zhu - arXiv preprint arXiv:2406.13207, 2024 - arxiv.org
This paper pursues a twofold goal. First, we introduce and study in detail a new notion of
variational analysis called generalized metric subregularity, which is a far-going extension of …