
Variable Earns Profit: Improved Adaptive Channel Estimation using Sparse VSS-NLMS Algorithms (1311.1315v1)

Published 6 Nov 2013 in cs.IT and math.IT

Abstract: Accurate channel estimation is essential for broadband wireless communications. As wireless channels often exhibit a sparse structure, adaptive sparse channel estimation algorithms based on the normalized least mean square (NLMS) have been proposed, e.g., the zero-attracting NLMS (ZA-NLMS) algorithm and the reweighted zero-attracting NLMS (RZA-NLMS) algorithm. In these NLMS-based algorithms, the step size used to iteratively update the channel estimate is a critical parameter that controls the estimation accuracy and the convergence speed (and hence the computational cost). However, conventional algorithms usually use an invariable step size (ISS), which leads to performance loss and/or slow convergence as well as high computational cost. To solve these problems, based on the observation that a large step size is preferred for fast convergence while a small step size is preferred for accurate estimation, we propose to replace the ISS with a variable step size (VSS) in conventional NLMS-based algorithms to improve adaptive sparse channel estimation in terms of bit error rate (BER) and mean square error (MSE) metrics. The proposed VSS-ZA-NLMS and VSS-RZA-NLMS algorithms adopt a VSS that adapts to the estimation error in each iteration, i.e., a large step size is used when the estimation error is large to accelerate convergence, while a small step size is used when the estimation error is small to improve steady-state estimation accuracy. Simulation results are provided to validate the effectiveness of the proposed scheme.
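
To make the idea concrete, here is a minimal sketch of a VSS-ZA-NLMS update loop in Python. The standard ZA-NLMS recursion (NLMS update plus an l1 zero-attracting term) is used; the specific error-dependent step-size rule `mu = mu_max * e^2 / (e^2 + C)` is only an illustrative assumption chosen to reflect the behavior described in the abstract (large error gives a large step, small error gives a small step), not the exact VSS rule derived in the paper, and the parameter names (`mu_max`, `rho`, `C`) are hypothetical.

```python
import numpy as np

def vss_za_nlms(x, d, N, mu_max=1.0, rho=5e-4, eps=1e-6, C=1e-2):
    """Illustrative VSS-ZA-NLMS adaptive sparse channel estimator.

    x : transmitted training sequence
    d : received (observed) sequence
    N : assumed channel length (number of taps)

    NOTE: the step-size rule below is a generic error-dependent choice
    used for illustration; the paper's exact VSS update may differ.
    """
    h = np.zeros(N)                                    # channel estimate
    for n in range(N, len(x)):
        x_vec = x[n - N:n][::-1]                       # regressor, most recent sample first
        e = d[n] - x_vec @ h                           # a-priori estimation error
        mu = mu_max * e**2 / (e**2 + C)                # large error -> large step, small error -> small step
        h += mu * e * x_vec / (x_vec @ x_vec + eps)    # normalized LMS update
        h -= rho * np.sign(h)                          # zero-attracting (l1) term promotes sparsity
    return h
```

As a quick usage check, one could generate a random BPSK training sequence, convolve it with a sparse channel (a few nonzero taps), add noise, and compare the MSE of `vss_za_nlms` against the same loop with a fixed `mu`; the error-dependent step size should converge faster early on and settle to a lower steady-state MSE, which is the trade-off the abstract describes.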

Citations (25)
