
Nonlinear Adaptive Algorithms on Rank-One Tensor Models (1610.07520v1)

Published 24 Oct 2016 in cs.SY and cs.LG

Abstract: This work proposes a low-complexity nonlinearity model and develops adaptive algorithms over it. The model is based on decomposable (rank-one, in tensor language) Volterra kernels. It may also be described as a product of FIR filters, which explains its low complexity. The rank-one model is also interesting because it arises from a well-posed problem in approximation theory. The paper uses this model in an estimation-theory context to develop an exact gradient-type algorithm, from which adaptive algorithms such as the least mean squares (LMS) filter and its data-reuse version, the TRUE-LMS, are derived. Stability and convergence issues are addressed. The algorithms are then tested in simulations, which show their good performance compared with other nonlinear processing algorithms in the literature.
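The "product of FIR filters" description in the abstract admits a compact illustration. Below is a minimal NumPy sketch, not the paper's implementation: the filter length, model order, step size mu, and initialization are assumptions made for illustration. It builds a rank-one (decomposable) Volterra model whose output is the product of the outputs of a few FIR branches, and adapts the branch coefficients with an LMS-type stochastic gradient update of the kind the abstract refers to.

```python
import numpy as np

def rank_one_volterra_lms(x, d, order=2, taps=8, mu=0.01, seed=0):
    """Sketch of an LMS-type adaptive filter on a rank-one Volterra model.

    Model output: y(n) = prod_p h_p^T x(n), i.e. a product of FIR branch
    outputs, where x(n) is the tap-delay vector of the input. The order,
    number of taps, and step size are illustrative, not the paper's values.
    """
    rng = np.random.default_rng(seed)
    # Small random initialization; an all-zero start would stall the
    # multiplicative gradient below.
    h = rng.normal(scale=0.1, size=(order, taps))
    y_hat = np.zeros_like(d, dtype=float)

    for n in range(taps - 1, len(x)):
        xn = x[n - taps + 1:n + 1][::-1]     # tap-delay line, newest sample first
        branch = h @ xn                       # each FIR branch output h_p^T x(n)
        y_hat[n] = np.prod(branch)            # product of FIR filters
        e = d[n] - y_hat[n]                   # instantaneous estimation error

        # Gradient of y(n) w.r.t. h_p is xn * prod_{q != p} branch[q],
        # so each branch gets an LMS-style stochastic update.
        for p in range(order):
            others = np.prod(np.delete(branch, p))
            h[p] += mu * e * others * xn

    return h, y_hat
```

As a quick usage check, one can synthesize a desired signal from a known pair of short FIR filters multiplied together, feed white noise through the sketch above, and verify that the squared error decreases over the run; data-reuse variants such as the paper's TRUE-LMS would reapply each sample several times per update, which is not shown here.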

Citations (2)
