Neural Bayes Estimators for Irregular Spatial Data using Graph Neural Networks (2310.02600v3)

Published 4 Oct 2023 in stat.ME and stat.ML

Abstract: Neural Bayes estimators are neural networks that approximate Bayes estimators in a fast and likelihood-free manner. Although they are appealing to use with spatial models, where estimation is often a computational bottleneck, neural Bayes estimators in spatial applications have, to date, been restricted to data collected over a regular grid. These estimators are also currently dependent on a prescribed set of spatial locations, which means that the neural network needs to be re-trained for new data sets; this renders them impractical in many applications and impedes their widespread adoption. In this work, we employ graph neural networks to tackle the important problem of parameter point estimation from data collected over arbitrary spatial locations. In addition to extending neural Bayes estimation to irregular spatial data, our architecture leads to substantial computational benefits, since the estimator can be used with any configuration or number of locations and independent replicates, thus amortising the cost of training for a given spatial model. We also facilitate fast uncertainty quantification by training an accompanying neural Bayes estimator that approximates a set of marginal posterior quantiles. We illustrate our methodology on Gaussian and max-stable processes. Finally, we showcase our methodology on a data set of global sea-surface temperature, where we estimate the parameters of a Gaussian process model in 2161 spatial regions, each containing thousands of irregularly-spaced data points, in just a few minutes with a single graphics processing unit.
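The key architectural idea in the abstract — message passing over a graph built from arbitrary spatial locations, followed by a permutation-invariant readout so the estimator works for any number or configuration of sites — can be illustrated with a minimal sketch. This is a hypothetical toy, not the authors' implementation: the k-nearest-neighbour graph construction, the fixed weights `w_self`/`w_neigh`, and the mean readout are illustrative stand-ins; a real neural Bayes estimator would learn the weights by minimising an empirical Bayes risk over pairs of parameters and data simulated from the spatial model.

```python
import math
import random

def knn_graph(locs, k=3):
    """Neighbour lists: the k nearest other locations of each site.

    Built from the irregular coordinates themselves, so no regular
    grid is assumed anywhere.
    """
    n = len(locs)
    neigh = []
    for i in range(n):
        order = sorted(range(n), key=lambda j: math.dist(locs[i], locs[j]))
        neigh.append([j for j in order if j != i][:k])
    return neigh

def propagate(values, neigh, w_self=0.5, w_neigh=0.5):
    """One mean-aggregation message-passing step (toy fixed weights)."""
    out = []
    for i, ns in enumerate(neigh):
        msg = sum(values[j] for j in ns) / len(ns)
        out.append(w_self * values[i] + w_neigh * msg)
    return out

def readout(values):
    """Permutation-invariant summary (mean), valid for any number
    of locations -- this is what lets one trained network be reused
    across data sets with different spatial configurations."""
    return sum(values) / len(values)

random.seed(0)
locs = [(random.random(), random.random()) for _ in range(10)]
vals = [random.gauss(0.0, 1.0) for _ in range(10)]
summary = readout(propagate(vals, knn_graph(locs)))
```

Because the graph is rebuilt from whatever coordinates arrive and the readout is a symmetric pooling, the same pipeline accepts 10 or 10,000 irregularly spaced points unchanged, which is the amortisation property the abstract emphasises.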
