
Univariate ReLU neural network and its application in nonlinear system identification (2003.02666v1)

Published 4 Mar 2020 in eess.SY, cs.SY, and eess.SP

Abstract: The ReLU (rectified linear unit) neural network has received significant attention since its emergence. In this paper, a univariate ReLU (UReLU) neural network is proposed both to model a nonlinear dynamic system and to reveal insights about the system. Specifically, the network consists of neurons with linear and UReLU activation functions, where the UReLU functions are ReLU functions defined with respect to each input dimension. The UReLU network is a single-hidden-layer neural network with a relatively simple structure. Its initialization employs the decoupling method, which provides both a good starting point and some insight into the nonlinear system. Compared with an ordinary ReLU neural network, the UReLU network has fewer parameters yet still provides a good approximation of the nonlinear dynamic system. The performance of the UReLU neural network is demonstrated on a hysteretic benchmark system, the Bouc-Wen system. Simulation results verify the effectiveness of the proposed method.
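The abstract describes a single-hidden-layer network whose hidden units are ReLU functions applied to one input dimension at a time, combined with linear terms. The following is a minimal sketch of that idea, not the paper's implementation: the function names, the fixed knot grid, and the least-squares fit of the output layer are all assumptions for illustration (the paper instead initializes via a decoupling method, which is not reproduced here).

```python
import numpy as np

def urelu_features(X, knots):
    """Build [linear terms | univariate ReLU terms] for inputs X of shape (n, d).

    knots[i] is an array of breakpoints for dimension i; each breakpoint b
    contributes the hidden-unit response max(0, x_i - b).
    """
    feats = [X]  # linear activation functions, one per input dimension
    for i, b in enumerate(knots):
        # UReLU units: ReLU applied to dimension i only, shifted by each knot
        feats.append(np.maximum(0.0, X[:, i:i + 1] - b[None, :]))
    return np.hstack(feats)

def fit_urelu(X, y, knots):
    """Fit the linear output layer by least squares (an illustrative choice)."""
    Phi = np.hstack([np.ones((X.shape[0], 1)), urelu_features(X, knots)])
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def predict(X, knots, w):
    Phi = np.hstack([np.ones((X.shape[0], 1)), urelu_features(X, knots)])
    return Phi @ w
```

As a quick sanity check, an additive target such as |x1| + max(0, x2) is exactly representable by these features (since |x| = 2 max(0, x) - x when a knot sits at 0), so a least-squares fit recovers it to near machine precision. Note that, being additive over dimensions, this basis cannot capture cross-terms between inputs on its own.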

Citations (1)
