Two-Level Lattice Neural Network Architectures for Control of Nonlinear Systems (2004.09628v2)

Published 20 Apr 2020 in cs.LG, cs.SY, eess.SY, math.OC, and stat.ML

Abstract: In this paper, we consider the problem of automatically designing a Rectified Linear Unit (ReLU) Neural Network (NN) architecture (number of layers and number of neurons per layer) with the guarantee that it is sufficiently parametrized to control a nonlinear system. Whereas current state-of-the-art techniques are based on hand-picked architectures or heuristic-based search to find such NN architectures, our approach exploits the given model of the system to design an architecture; as a result, we provide a guarantee that the resulting NN architecture is sufficient to implement a controller that satisfies an achievable specification. Our approach exploits two basic ideas. First, assuming that the system can be controlled by an unknown Lipschitz-continuous state-feedback controller with some Lipschitz constant upper-bounded by $K_\text{cont}$, we bound the number of affine functions needed to construct a Continuous Piecewise Affine (CPWA) function that can approximate the unknown Lipschitz-continuous controller. Second, we utilize the authors' recent results on a novel NN architecture named the Two-Level Lattice (TLL) NN architecture, which was shown to be capable of implementing any CPWA function from knowledge of the number of affine functions that comprise this CPWA function alone.
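
The second ingredient, the Two-Level Lattice representation, writes a CPWA function as a maximum over groups of minima of affine functions, which is why knowing only the number of affine pieces suffices to size the network. As a minimal illustrative sketch (not the paper's construction; the helper `tll_cpwa`, its parameters, and the example values are assumptions made for illustration), one can evaluate such a max-of-mins form directly:

```python
# Minimal sketch of evaluating a CPWA function in a two-level lattice
# (max-of-mins) form; parameters and selector sets are illustrative only.
import numpy as np

def tll_cpwa(x, W, b, selector_sets):
    """Evaluate f(x) = max_j min_{i in S_j} (W[i] @ x + b[i]).

    W : (N, n) array of linear coefficients, one row per local affine function.
    b : (N,) array of offsets.
    selector_sets : list of index sets S_j choosing which affine functions
        participate in each first-level minimum.
    """
    local = W @ x + b                                     # all N affine values at x
    mins = [local[list(S)].min() for S in selector_sets]  # first level: minima
    return max(mins)                                      # second level: maximum

# Illustrative example in one state dimension: three affine pieces, two groups.
W = np.array([[1.0], [-1.0], [0.0]])
b = np.array([0.0, 1.0, 0.25])
selectors = [[0, 2], [1, 2]]
print(tll_cpwa(np.array([0.3]), W, b, selectors))
```

In the TLL NN architecture itself, these two levels are realized with ReLU-based min- and max-subnetworks rather than the direct NumPy reductions used in this sketch.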

Citations (7)
