
Rates of convergence and normal approximations for estimators of local dependence random graph models (2404.11464v2)

Published 17 Apr 2024 in math.ST and stat.TH

Abstract: Local dependence random graph models are a class of block models for network data which allow for dependence among edges under a local dependence assumption defined around the block structure of the network. Since this class of models was introduced by Schweinberger and Handcock (2015), research in the statistical network analysis and network science literatures has demonstrated its potential and utility. In this work, we provide the first theory for estimation and inference which ensures consistent and valid inference of parameter vectors of local dependence random graph models. This is accomplished by deriving convergence rates of estimation and inference procedures for local dependence random graph models based on a single observation of the graph, allowing both the number of model parameters and the sizes of blocks to tend to infinity. First, we derive non-asymptotic bounds on the $\ell_2$-error of maximum likelihood estimators with convergence rates, outlining conditions under which these rates are minimax optimal. Second, and more importantly, we derive non-asymptotic bounds on the error of the multivariate normal approximation. These theoretical results are the first to achieve both optimal rates of convergence and non-asymptotic bounds on the error of the multivariate normal approximation for parameter vectors of local dependence random graph models.
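As context for the abstract, a minimal sketch of the local dependence assumption of Schweinberger and Handcock (2015) may help; the notation below (block subgraphs $X_{k,l}$ and parameter vector $\theta$) is illustrative and not taken from this paper. Given a partition of the nodes into blocks $\mathcal{A}_1, \ldots, \mathcal{A}_K$, the joint distribution of the random graph $X$ factorizes over within- and between-block subgraphs,

$$P_\theta(X = x) \;=\; \prod_{k=1}^{K} P_\theta\big(X_{k,k} = x_{k,k}\big) \prod_{1 \le k < l \le K} P_\theta\big(X_{k,l} = x_{k,l}\big),$$

where edges may be dependent within each within-block subgraph $X_{k,k}$, while the between-block subgraphs $X_{k,l}$ ($k \ne l$) consist of independent edges. The paper's results concern maximum likelihood estimators $\widehat{\theta}$ of $\theta$ in models of this form, estimated from a single observed graph, with both the dimension of $\theta$ and the block sizes allowed to grow.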
