
Diffusion LMS for clustered multitask networks (1310.8615v1)

Published 31 Oct 2013 in cs.SY, cs.IT, cs.MA, and math.IT

Abstract: Recent research on distributed adaptive networks has intensively studied the case where the nodes collaboratively estimate a common parameter vector. However, many applications are multitask-oriented in the sense that multiple parameter vectors need to be inferred simultaneously. In this paper, we employ diffusion strategies to develop distributed algorithms that address clustered multitask problems by minimizing an appropriate mean-square error criterion with $\ell_2$-regularization. Results on the mean-square stability and convergence of the algorithm are also provided. Simulations are conducted to illustrate the theoretical findings.
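
The sketch below illustrates the general idea described in the abstract: an adapt-then-combine diffusion LMS recursion in which each node adapts with its own streaming data, an $\ell_2$ term pulls its estimate toward neighbors in other clusters, and the combination step averages intermediate estimates within the node's own cluster. The network topology, cluster assignment, step size, and regularization strength are illustrative assumptions, not the paper's exact formulation or parameter choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative network: 4 nodes, two clusters {0, 1} and {2, 3}.
# Nodes within a cluster share the same unknown parameter vector.
M = 2                                   # parameter dimension
clusters = [0, 0, 1, 1]                 # cluster index of each node
neighbors = {0: [0, 1, 2], 1: [0, 1, 3], 2: [0, 2, 3], 3: [1, 2, 3]}
w_true = {0: np.array([1.0, -0.5]),     # cluster-0 optimum (assumed)
          1: np.array([0.8, -0.3])}     # cluster-1 optimum: similar, not equal

mu = 0.01       # step size (assumed)
eta = 0.05      # l2-regularization strength coupling neighboring clusters
N = len(clusters)
w = [np.zeros(M) for _ in range(N)]     # per-node estimates

for i in range(5000):
    psi = [None] * N
    for k in range(N):
        # Streaming data: regression vector and noisy scalar measurement.
        u = rng.standard_normal(M)
        d = u @ w_true[clusters[k]] + 0.01 * rng.standard_normal()

        # Adaptation step: LMS correction plus an l2 term pulling the
        # estimate toward neighbors that belong to *other* clusters.
        grad_reg = sum(w[l] - w[k]
                       for l in neighbors[k] if clusters[l] != clusters[k])
        psi[k] = w[k] + mu * (d - u @ w[k]) * u + mu * eta * grad_reg

    for k in range(N):
        # Combination step: average intermediate estimates over the
        # in-cluster neighborhood (uniform combination weights).
        in_cluster = [l for l in neighbors[k] if clusters[l] == clusters[k]]
        w[k] = sum(psi[l] for l in in_cluster) / len(in_cluster)

print([np.round(wk, 3) for wk in w])
```

With these settings the two clusters converge toward their respective optima while the regularization keeps the estimates of coupled clusters close, which is the qualitative behavior the abstract's mean-square analysis addresses.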

Citations (29)
