
Least Mean Square/Fourth Algorithm with Application to Sparse Channel Estimation (1304.3911v1)

Published 14 Apr 2013 in cs.IT and math.IT

Abstract: Broadband signal transmission over a frequency-selective fading channel often requires accurate channel state information at the receiver. One of the most attractive adaptive channel estimation methods is the least mean square (LMS) algorithm. However, LMS-based methods are often degraded by random scaling of the input training signal. To improve the estimation performance, in this paper we apply the standard least mean square/fourth (LMS/F) algorithm to adaptive channel estimation (ACE). Since the broadband channel is often described by a sparse channel model, such sparsity can be exploited as prior information. First, we propose an adaptive sparse channel estimation (ASCE) method using the zero-attracting LMS/F (ZA-LMS/F) algorithm. To exploit the sparsity more effectively, an improved channel estimation method is also proposed, using the reweighted zero-attracting LMS/F (RZA-LMS/F) algorithm. We explain why sparse LMS/F algorithms using an l_1-norm sparsity constraint can improve estimation performance by virtue of a geometrical interpretation. In addition, for different channel sparsity levels, we propose a Monte Carlo method to select the regularization parameter for ZA-LMS/F and RZA-LMS/F so as to achieve approximately optimal estimation performance. Finally, simulation results show that the proposed ASCE methods achieve better estimation performance than the conventional method.
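
The abstract does not reproduce the update equations, so the following is a minimal Python sketch of a reweighted zero-attracting LMS/F-style estimator, assuming the commonly used LMS/F gradient term e^3/(e^2 + delta) and the reweighted l_1 zero attractor rho*sgn(w)/(1 + eps*|w|). The function name, parameter names, and default values are illustrative; consult the paper for the exact ZA-LMS/F and RZA-LMS/F formulations and parameter choices.

```python
import numpy as np

def rza_lms_f(x, d, num_taps, mu=0.01, delta=1.0, rho=2e-4, eps_rw=10.0):
    """Adaptive sparse channel estimation with a reweighted zero-attracting
    LMS/F-style update (illustrative sketch, not the paper's exact algorithm).

    x        : input (training) signal, 1-D array
    d        : desired/received signal, same length as x
    num_taps : assumed channel length
    mu       : step size
    delta    : LMS/F threshold mixing the second- and fourth-order behavior
    rho      : zero-attraction (sparsity) strength
    eps_rw   : reweighting parameter; eps_rw -> 0 recovers a plain ZA-LMS/F-style attractor
    """
    w = np.zeros(num_taps)        # channel estimate
    x_buf = np.zeros(num_taps)    # sliding window of the most recent inputs
    for n in range(len(x)):
        x_buf = np.roll(x_buf, 1)
        x_buf[0] = x[n]
        e = d[n] - w @ x_buf      # a priori estimation error
        # LMS/F gradient term: behaves like LMF for small errors, like LMS for large errors
        w += mu * (e**3 / (e**2 + delta)) * x_buf
        # reweighted zero attractor: pulls small taps toward zero, leaves large taps mostly alone
        w -= rho * np.sign(w) / (1.0 + eps_rw * np.abs(w))
    return w

# Toy usage: estimate a sparse 16-tap channel from noisy observations.
rng = np.random.default_rng(0)
h = np.zeros(16); h[[2, 7, 12]] = [0.8, -0.5, 0.3]   # sparse "true" channel
x = rng.standard_normal(5000)
d = np.convolve(x, h, mode="full")[:len(x)] + 0.01 * rng.standard_normal(len(x))
h_hat = rza_lms_f(x, d, num_taps=16)
print(np.round(h_hat, 2))
```

The reweighting denominator is what distinguishes the RZA variant from the ZA variant in this sketch: taps near zero receive nearly the full attraction rho, while large taps are shrunk far less, which matches the geometrical intuition described in the abstract.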

Citations (3)
