
Communication-Efficient Distributed Estimator for Generalized Linear Models with a Diverging Number of Covariates (2001.06194v2)

Published 17 Jan 2020 in stat.ME, cs.DC, cs.LG, and stat.ML

Abstract: Distributed statistical inference has recently attracted immense attention. The asymptotic efficiency of the maximum likelihood estimator (MLE), the one-step MLE, and the aggregated estimating equation estimator are established for generalized linear models under the "large $n$, diverging $p_n$" framework, where the dimension of the covariates $p_n$ grows to infinity at a polynomial rate $o(n^{\alpha})$ for some $0<\alpha<1$. Then a novel method is proposed to obtain an asymptotically efficient estimator for large-scale distributed data by two rounds of communication. In this novel method, the assumption on the number of servers is more relaxed and thus practical for real-world applications. Simulations and a case study demonstrate the satisfactory finite-sample performance of the proposed estimators.
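The two-round scheme described in the abstract can be illustrated with a minimal sketch: an initial estimate is broadcast to the machines (round one), each machine returns its local score and Hessian (round two), and the center takes a single Newton step on the aggregated quantities. This is a generic one-step MLE sketch for logistic regression, not the paper's exact algorithm; the function names and the toy data are assumptions for illustration.

```python
import numpy as np

def local_score_and_hessian(X, y, beta):
    """Logistic-regression score (gradient) and Hessian of the
    log-likelihood, computed on one machine's data shard."""
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    score = X.T @ (y - p)              # gradient of the log-likelihood
    W = p * (1.0 - p)
    hess = -(X * W[:, None]).T @ X     # Hessian (negative definite)
    return score, hess

def one_step_distributed_mle(shards, beta_init):
    """Round 1: broadcast beta_init to every shard.
    Round 2: collect local scores/Hessians, then take one
    Newton step at the center on the aggregated quantities."""
    d = beta_init.shape[0]
    score = np.zeros(d)
    hess = np.zeros((d, d))
    for X, y in shards:
        s, H = local_score_and_hessian(X, y, beta_init)
        score += s
        hess += H
    # Newton ascent step: beta - H^{-1} g with H negative definite
    return beta_init - np.linalg.solve(hess, score)

# Toy usage: simulate 10 shards of logistic-regression data.
rng = np.random.default_rng(0)
beta_true = np.array([1.0, -0.5, 0.25])
shards = []
for _ in range(10):
    X = rng.normal(size=(500, 3))
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))
    shards.append((X, y))

# Crude initial estimator: a local MLE fit on the first shard only.
beta0 = np.zeros(3)
X0, y0 = shards[0]
for _ in range(20):
    s, H = local_score_and_hessian(X0, y0, beta0)
    beta0 = beta0 - np.linalg.solve(H, s)

beta_hat = one_step_distributed_mle(shards, beta0)
```

The one-step update refines the shard-local initial estimate using information from all machines while transmitting only a $p_n$-vector and a $p_n \times p_n$ matrix per machine, which is the communication pattern the abstract's "two rounds of communication" refers to.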

Authors (5)
  1. Ping Zhou (116 papers)
  2. Zhen Yu (19 papers)
  3. Jingyi Ma (4 papers)
  4. Maozai Tian (5 papers)
  5. Ye Fan (11 papers)
Citations (4)
