Feedback Stabilization Methods for the Solution of Nonlinear Programming Problems (1211.1123v1)
Abstract: In this work we show that, given a nonlinear programming problem, it is possible to construct a family of dynamical systems defined on the feasible set of the problem, so that: (a) the equilibrium points are the unknown critical points of the problem, (b) each dynamical system admits the objective function of the problem as a Lyapunov function, and (c) explicit formulae are available that do not involve the unknown critical points of the problem. The construction of the family of dynamical systems is based on the Control Lyapunov Function methodology, which is used in mathematical control theory for the construction of stabilizing feedback laws. Knowledge of a dynamical system with these properties allows the construction of algorithms that guarantee global convergence to the set of critical points.
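The following is a minimal, hedged sketch of the general idea the abstract describes, restricted to the simplest (unconstrained) case: the gradient flow x' = -grad f(x) is a dynamical system whose equilibria are exactly the critical points of f, and f decreases along its trajectories, so f serves as a Lyapunov function; an explicit Euler discretization of the flow then gives a convergent descent algorithm. This is not the paper's construction (which works on the feasible set of a constrained problem via the Control Lyapunov Function methodology); the function and parameter names (`gradient_flow_descent`, `grad_f`, `step`) are illustrative assumptions.

```python
import numpy as np

# Simplified, unconstrained illustration of the idea in the abstract:
# the gradient flow  x' = -grad f(x)  has the critical points of f as its
# equilibria, and f itself decreases along trajectories, so it acts as a
# Lyapunov function. The paper's construction extends this to the feasible
# set of a constrained problem via Control Lyapunov Functions; the concrete
# feedback formulae given there are not reproduced here.

def gradient_flow_descent(grad_f, x0, step=1e-2, tol=1e-8, max_iter=100_000):
    """Explicit-Euler discretization of the gradient flow x' = -grad_f(x)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:   # equilibrium of the flow = critical point
            break
        x = x - step * g              # Euler step along the vector field
    return x

if __name__ == "__main__":
    # Example objective: f(x, y) = (x - 1)^2 + 2*(y + 0.5)^2
    grad_f = lambda x: np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 0.5)])
    x_star = gradient_flow_descent(grad_f, x0=[5.0, -3.0])
    print(x_star)  # approximately [1.0, -0.5]
```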