
Information Theoretic Structure Learning with Confidence (1609.03912v1)

Published 13 Sep 2016 in cs.IT, cs.LG, math.IT, and stat.ML

Abstract: Information theoretic measures (e.g. the Kullback-Leibler divergence and Shannon mutual information) have been used for exploring possibly nonlinear multivariate dependencies in high dimension. If these dependencies are assumed to follow a Markov factor graph model, this exploration process is called structure discovery. For discrete-valued samples, estimates of the information divergence over the parametric class of multinomial models lead to structure discovery methods whose mean squared error achieves parametric convergence rates as the sample size grows. However, a naive application of this method to continuous nonparametric multivariate models converges much more slowly. In this paper we introduce a new method for nonparametric structure discovery that uses weighted ensemble divergence estimators that achieve parametric convergence rates and obey an asymptotic central limit theorem that facilitates hypothesis testing and other types of statistical validation.

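To make the abstract's idea concrete, below is a minimal illustrative sketch, not the paper's estimator: it combines several k-NN mutual information estimates (scikit-learn's `mutual_info_regression`) into a simple weighted ensemble and tests each candidate edge with a permutation test. The function names `ensemble_mi` and `edge_test`, the uniform weights, and the permutation-based null are all assumptions for illustration; the paper instead derives optimally weighted ensemble divergence estimators and a CLT-based test.

```python
# Illustrative sketch only: uniform-weight ensemble of k-NN MI estimates and a
# permutation test, standing in for the paper's optimally weighted ensemble
# divergence estimators and CLT-based hypothesis test.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

def ensemble_mi(x, y, neighbor_sizes=(3, 5, 10), weights=None, random_state=0):
    """Weighted ensemble of k-NN mutual information estimates (hypothetical helper)."""
    estimates = np.array([
        mutual_info_regression(x.reshape(-1, 1), y,
                               n_neighbors=k, random_state=random_state)[0]
        for k in neighbor_sizes
    ])
    if weights is None:
        weights = np.full(len(estimates), 1.0 / len(estimates))  # uniform, not optimal
    return float(np.dot(weights, estimates))

def edge_test(x, y, n_perm=200, alpha=0.05, seed=None):
    """Permutation test for dependence between x and y (stand-in for the CLT-based test)."""
    rng = np.random.default_rng(seed)
    observed = ensemble_mi(x, y)
    null = np.array([ensemble_mi(x, rng.permutation(y)) for _ in range(n_perm)])
    p_value = (1 + np.sum(null >= observed)) / (1 + n_perm)
    return observed, p_value, p_value < alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 500
    x = rng.normal(size=n)
    y = np.sin(x) + 0.3 * rng.normal(size=n)   # nonlinearly dependent on x
    z = rng.normal(size=n)                      # independent of x
    print("x-y edge:", edge_test(x, y, seed=1))
    print("x-z edge:", edge_test(x, z, seed=2))
```

Declaring an edge only when the test rejects independence mirrors the "structure discovery with confidence" theme, though the paper's CLT gives calibrated tests without resampling.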
Authors (4)
  1. Kevin R. Moon (31 papers)
  2. Morteza Noshad (11 papers)
  3. Salimeh Yasaei Sekeh (34 papers)
  4. Alfred O. Hero III (89 papers)
Citations (18)
