
Gradient Algorithms for Complex Non-Gaussian Independent Component/Vector Extraction, Question of Convergence (1803.10108v2)

Published 27 Mar 2018 in eess.SP, cs.IT, and math.IT

Abstract: We revisit the problem of extracting one independent component from an instantaneous linear mixture of signals. The mixing matrix is parameterized by two vectors: one column of the mixing matrix and one row of the de-mixing matrix. The separation is based on the non-Gaussianity of the source of interest, while the other background signals are assumed to be Gaussian. Three gradient-based estimation algorithms are derived using the maximum-likelihood principle and are compared with the Natural Gradient algorithm for Independent Component Analysis and with One-unit FastICA based on negentropy maximization. The ideas and algorithms are also generalized to the extraction of a vector component when the extraction proceeds jointly from a set of instantaneous mixtures. Throughout the paper, we address the size of the region of convergence within which the algorithms guarantee extraction of the desired source, and we show how that size is influenced by the ratio of the powers of the sources within the mixture. Simulations comparing several algorithms confirm this observation: the algorithms exhibit different convergence behavior depending on whether the source of interest is dominant or weak. Our proposed modifications of the gradient methods, which take the dominance or weakness of the source into account, show improved global convergence.
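The One-unit FastICA baseline that the abstract compares against can be sketched as follows. This is an illustrative implementation under assumed conditions, not the paper's proposed gradient algorithms: a single super-Gaussian (Laplacian) source of interest is mixed with Gaussian background signals, the observations are whitened, and a fixed-point iteration with a `tanh` contrast extracts the non-Gaussian direction.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 4, 20000

# One non-Gaussian (Laplacian) source of interest plus Gaussian background
# signals, as in the extraction model described in the abstract.
s = rng.laplace(size=T)
S = np.vstack([s, rng.standard_normal((n - 1, T))])

A = rng.standard_normal((n, n))   # random square mixing matrix
X = A @ S                          # observed instantaneous mixture

# Whiten the observations: zero mean, identity covariance.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = (E / np.sqrt(d)).T @ X         # Z = D^{-1/2} E^T X

# One-unit FastICA fixed-point iteration with the tanh nonlinearity.
w = rng.standard_normal(n)
w /= np.linalg.norm(w)
for _ in range(200):
    y = w @ Z
    g, g_prime = np.tanh(y), 1.0 - np.tanh(y) ** 2
    w_new = (Z * g).mean(axis=1) - g_prime.mean() * w
    w_new /= np.linalg.norm(w_new)
    converged = abs(w_new @ w) > 1 - 1e-10   # direction no longer changes
    w = w_new
    if converged:
        break

# The extracted signal should match the non-Gaussian source up to sign/scale.
y = w @ Z
corr = abs(np.corrcoef(y, s)[0, 1])
```

The paper's point about the region of convergence can be probed with this kind of setup by scaling the power of `s` relative to the background and observing for which initializations `w` the iteration still recovers the desired source.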

Citations (70)
