On an Empirical Likelihood based Solution to the Approximate Bayesian Computation Problem (2403.05080v1)

Published 8 Mar 2024 in stat.ME

Abstract: Approximate Bayesian Computation (ABC) methods are applicable to statistical models specified by generative processes with analytically intractable likelihoods. These methods try to approximate the posterior density of a model parameter by comparing the observed data with additional process-generated simulated datasets. For computational benefit, only the values of certain well-chosen summary statistics are usually compared, instead of the whole dataset. Most ABC procedures are computationally expensive, justified only heuristically, and have poor asymptotic properties. In this article, we introduce a new empirical likelihood-based approach to the ABC paradigm called ABCel. The proposed procedure is computationally tractable and approximates the target log posterior of the parameter as a sum of two functions of the data -- namely, the mean of the optimal log-empirical likelihood weights and the estimated differential entropy of the summary functions. We rigorously justify the procedure via direct and reverse information projections onto appropriate classes of probability densities. Past applications of empirical likelihood in ABC demanded constraints based on analytically tractable estimating functions that involve both the data and the parameter, although by the nature of the ABC problem such functions may not be available in general. In contrast, we use constraints that are functions of the summary statistics only. Equally importantly, we show that our construction directly connects to the reverse information projection. We show that ABCel is posterior consistent and has highly favourable asymptotic properties. Its construction justifies the use of simple summary statistics like moments, quantiles, etc., which in practice produce an accurate approximation of the posterior density. We illustrate the performance of the proposed procedure in a range of applications.
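
The abstract's description of the ABCel approximation -- the mean of the optimal log-empirical-likelihood weights plus an estimated differential entropy of the summary functions -- maps onto a short computation. The sketch below is not the paper's implementation; it is a minimal illustration that assumes (i) the empirical likelihood constraint forces the weighted mean of simulated summaries to match the observed summary, (ii) the entropy term is estimated with a Kozachenko-Leonenko-style k-nearest-neighbour estimator, and (iii) a log prior is simply added. The callables `simulate_summaries` and `log_prior` are hypothetical user-supplied placeholders.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln


def el_log_weights(z):
    """Log of the optimal empirical likelihood weights for the constraints
    sum_i w_i = 1 and sum_i w_i z_i = 0 (Owen-style EL), obtained by minimising
    the convex dual in lambda with a quadratic 'log-star' extension for stability."""
    m, d = z.shape
    eps = 1.0 / m

    def log_star(t):
        # log(t) for t >= eps, quadratic continuation below eps (keeps dual finite)
        return np.where(t >= eps,
                        np.log(np.maximum(t, eps)),
                        np.log(eps) - 1.5 + 2.0 * t / eps - 0.5 * (t / eps) ** 2)

    def dual(lam):
        return -np.sum(log_star(1.0 + z @ lam))

    lam = minimize(dual, np.zeros(d), method="BFGS").x
    # w_i = 1 / (m * (1 + lam' z_i)); degenerate if 0 lies outside the convex hull of z
    return -np.log(m) - np.log(np.maximum(1.0 + z @ lam, eps))


def knn_entropy(x, k=5):
    """Kozachenko-Leonenko-type k-NN estimate of the differential entropy of x."""
    n, d = x.shape
    dist, _ = cKDTree(x).query(x, k=k + 1)      # column 0 is the point itself
    rho = np.maximum(dist[:, k], 1e-12)         # distance to the k-th neighbour
    log_vd = 0.5 * d * np.log(np.pi) - gammaln(1.0 + 0.5 * d)  # log unit-ball volume
    return d * np.mean(np.log(rho)) + log_vd + np.log(n - 1) - digamma(k)


def abcel_log_kernel(theta, s_obs, simulate_summaries, log_prior, m=500, k=5):
    """One evaluation of an ABCel-style log-posterior kernel at theta:
    mean optimal log-EL weight + estimated entropy of the simulated summaries
    (+ log prior), with constraints that involve the summaries only."""
    s_sim = simulate_summaries(theta, m)        # (m, d) simulated summary statistics
    log_w = el_log_weights(s_sim - s_obs)       # constraint: weighted mean matches s_obs
    return np.mean(log_w) + knn_entropy(s_sim, k=k) + log_prior(theta)
```

In practice such a kernel would be evaluated inside an MCMC or importance-sampling loop over the parameter; the paper should be consulted for the exact constraint set, the role of the prior, and the choice of entropy estimator.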
