Blind Channel Equalization (1208.2205v1)

Published 10 Aug 2012 in cs.IT and math.IT

Abstract: Future services demand high data rates and quality, so new, robust algorithms are needed to equalize channels and reduce noise in communications. New equalization algorithms, notably blind channel equalization, are being developed to optimize channel bandwidth and reduce noise. Conventional equalizers that minimize mean-square error generally require a training sequence accompanying the data sequence. In this study, the results of the Least Mean Square (LMS) algorithm applied to two given communication channels are analyzed. Because blind equalizers do not require pilot signals to recover the transmitted data, implementations of four variants of the Constant Modulus Algorithm (CMA) for blind equalization of the channels are shown. Finally, a comparison of the simulation results of LMS and CMA for the test channels is provided.
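To make the contrast concrete, the sketch below shows a trained LMS tap update alongside a blind CMA-2-2 update. It is a minimal illustration under assumed settings (tap count, step size mu, constant modulus R2 = 1 for a QPSK-like constellation); the function names and the pilot-alignment handling are illustrative, not the authors' implementation.

import numpy as np

def lms_taps(received, training, num_taps=11, mu=1e-3):
    # Trained LMS: uses known pilot/training symbols as the desired output.
    # Assumes `training` is already aligned with `received` (channel delay ignored).
    w = np.zeros(num_taps, dtype=complex)
    for n in range(num_taps - 1, len(received)):
        x = received[n - num_taps + 1:n + 1][::-1]  # regressor, newest sample first
        y = np.dot(w, x)                            # equalizer output
        e = training[n] - y                         # error against the known symbol
        w += mu * e * np.conj(x)                    # stochastic-gradient step
    return w

def cma_taps(received, num_taps=11, mu=1e-3, R2=1.0):
    # Blind CMA-2-2: penalizes deviation of |y|^2 from the constant modulus R2,
    # so no training sequence is required.
    w = np.zeros(num_taps, dtype=complex)
    w[num_taps // 2] = 1.0                          # centre-spike initialization
    for n in range(num_taps - 1, len(received)):
        x = received[n - num_taps + 1:n + 1][::-1]
        y = np.dot(w, x)
        e = y * (np.abs(y) ** 2 - R2)               # CMA error term, no pilots needed
        w -= mu * e * np.conj(x)
    return w

Both routines perform the same stochastic-gradient tap update; the only difference is the error term, which is exactly what the paper's LMS-versus-CMA comparison isolates.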

Citations (14)
